ChuckMcM 5 days ago

This is an update from the folks doing Thunderscope (a high frequency "soft" oscilloscope) but this really stands out:

Teaching KiCad a New Trick - Matching Delays

At time of writing, KiCad only understands the length of traces and pins. When length matching, it takes length as a single number added up across every layer. This leads to delay mismatches, as the signals on the inner layers are slower than the signals on the outer layers. When assigning pin lengths, you need to arbitrarily choose a layer to convert a delay value (given by the manufacturer) to a length. This also results in delay mismatches.

I wanted to do this right, just like Altium does, but I didn’t want to have to calculate and add up all the delay values by hand in a spreadsheet. So I made a script to rewrite custom design rules to try to get KiCad’s length matching to be delay matching (including pad delays).

Closed source design tools leave you stuck, and often when a need like this surfaces you end up paying a lot of money for an "option pack" that adds the capability. If you have ever wondered whether KiCad was up to doing any kind of design, this should assure you that it is: it works just fine, and you can kick that $10,000 Altium license to the curb.
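The delay-to-length conversion described in the quote is simple at its core: each layer has its own propagation delay set by its effective dielectric constant, so the same delay maps to different equivalent lengths on different layers. A minimal sketch (this is not the author's actual script, and the Er values and 7 ps pad delay are made-up illustration numbers, not from the article):

```python
# Sketch of delay-to-length conversion for per-layer delay matching.
# NOTE: illustrative only -- not the author's script and not KiCad's API.
C_MM_PER_PS = 0.299792458  # speed of light in vacuum, mm per ps


def delay_per_mm(er_eff: float) -> float:
    """Propagation delay (ps/mm) on a layer with effective dielectric er_eff."""
    return er_eff ** 0.5 / C_MM_PER_PS


def delay_to_length_mm(delay_ps: float, er_eff: float) -> float:
    """Equivalent trace length for a given delay on a given layer."""
    return delay_ps / delay_per_mm(er_eff)


# Outer layer (microstrip, lower effective Er) vs. inner layer (stripline):
pad_delay_ps = 7.0
outer_mm = delay_to_length_mm(pad_delay_ps, er_eff=3.0)
inner_mm = delay_to_length_mm(pad_delay_ps, er_eff=4.3)
# The same delay corresponds to a longer length on the faster outer layer,
# which is why one length number summed across layers mismatches delays.
```

Picking a single layer's conversion for a pad delay, as the quote describes, bakes in an error equal to the gap between those two lengths.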

  • dcrazy a day ago

    Plenty of closed source software packages are extensible and scriptable. The entire 3D modeling industry is a great example.

    • rowanG077 a day ago

      Only up to a point. There are a ton of scenarios where you simply run up against the API's limitations.

    • Mbwagava 21 hours ago

      Why haven't people bludgeoned them into opening their software if it's so useful? This is inevitable; in fact you could measure the long-term efficiency of an industry by how quickly this happens. Movie studios are famously bad at spending money (yes there are exceptions, but they can be counted on one hand).

      • dcrazy 14 hours ago

        Because most people are much more concerned about doing their jobs than having access to source code they aren’t going to look at, much less modify.

  • LordShredda a day ago

    The quote does say that Altium does it out of the box, though? With KiCad you had to write a script and learn how the file format works. It could also be upstreamed as a patch so everyone benefits, but that's less time spent working on the circuit.

    • explodingwaffle a day ago

      This is coming in the next release of Kicad: https://forum.kicad.info/t/post-v9-new-features-and-developm...

      The rate of development since V6 is crazy fast IMO. Very much an OSS success story.

      • crote 10 hours ago

        It's absolutely insane. Kicad v5 was usable, if you wanted to make simple projects and were willing to deal with frequent annoyances. Kicad v6 took forever to release, but it suddenly went from "an option for hobbyists who can't afford EAGLE / Altium" to "viable tool for not-too-complicated professional products". Ever since then every release has been filled with quality-of-life improvements - both huge improvements and fixes for small annoyances.

        We saw something similar with Blender. At a certain point it becomes good enough that for some professionals it becomes a viable alternative to its obscenely expensive proprietary competition. If those companies are willing to donate $500 / seat / year to OSS instead of spending $1500 / seat / year on proprietary licensing, they can get some developer to fix the main issues they run into. This in turn means the OSS variant gets even better, which means even more companies are willing to consider switching, which means even more budget for development. Let this continue for a few years, and the OSS alternative has suddenly become best-in-class.

    • ericwood a day ago

      But you have recourse! It’s not ideal, but it beats being at the mercy of a vendor in most cases. Trying to hack around a closed source format is an even bigger drain.

    • bsder a day ago

      > The quote does say that Altium does it out of the box though?

      Sure, if you're routing 8+ layer boards with blind vias and PCIe x16 and DDR5 buses every day, go buy an Allegro or Expedition licence for 6 figures. It's absolutely worth the money.

      For Altium, I find that the ratio of "showstopper bugs Altium has" to "features Kicad doesn't have" almost always falls strongly in favor of Kicad.

gitroom a day ago

Nice progress here, always gets my brain going about open source vs paid tools. Real question for me - you think having that control and flexibility is worth jumping through more hoops up front or not worth it if time's tight?

IshKebab 18 hours ago

I don't see "why open source is better" mentioned anywhere? If anything he says that Altium already had the correct behaviour but he had to write a script to work around it in Kicad.

  • f_devd 14 hours ago

    He switched from Altium to KiCAD for the current version; I think the title's claim is that even though KiCAD doesn't have the feature (yet), it can be added without external blockers because it's open source.

    The title doesn't match the article title though, so unless the author and OP are the same it's a bit weird.

amelius a day ago

Regarding the Thunderscope project, I wonder if it would be feasible one day to have a scope that can sample USB3 signals.

(Actually, a device that can measure bit error rates would be great too).

  • tverbeure an hour ago

    For eye diagrams, you can make do with a sampling oscilloscope instead of a regular single-shot oscilloscope. My favorite presentation at Hackaday Supercon 2019 was this one: https://www.youtube.com/watch?v=99u53V7uDFY.

    Unfortunately, the author burned out on it and the project is dead. But the presentation is still worth watching.

  • LiamPowell a day ago

    I assume you mean decoding as sampling is just a matter of bandwidth. There are already decoders for USB 3: https://www.keysight.com/au/en/product/D9010USBP/usb-3-x-pro...

    • amelius 12 hours ago

      "just" :)

      Yes, I mean sampling. Preferably I want to see eye diagrams, using a DIY device. It should be possible, perhaps using delay lines (as on the HN frontpage right now).

      Also, if a device like this exists, then maybe someone can write an open-source tool to compute the bit-error-rate from digital inputs. Or write some Wireshark extension to do decoding of raw signals.

      Keysight makes nice tools, but they are out of reach of hobbyists and small companies, and cheaper tools should be possible since we all have USB3 devices in our computers already (digital ones, granted).

  • Mbwagava a day ago

    Tbh, I am surprised this isn't possible. One would assume sampling a signal is a fundamental part of working with hardware.

    • crote 9 hours ago

      A device communicating over USB3 has to sample once per symbol period, at 1-bit resolution. Most of the hard stuff is offloaded to purpose-designed silicon, so that 5Gsps analog signal is quickly turned into, say, a 128-bit bus running at 39MHz. That's fairly easy to deal with - especially because most of it is either directly processed and forwarded (like webcams), or has some form of flow control (hard drives).

      If you're diagnosing signal quality you're going to want to look at the analog signals, which means sampling at a rate significantly faster than the baud rate, and at an 8-bit or higher resolution to actually see analog behaviour. Suddenly you're dealing with 400Gbps of incoming sample data - and you have to do realtime analysis on that to trigger at the right time, and be capable of storing at least a few tens of thousands of samples for display.
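The arithmetic behind those figures checks out; a quick back-of-envelope (the 10x oversampling factor is an assumption to reach the 400Gbps figure, not something stated in the comment):

```python
# Back-of-envelope for the USB3 numbers above.
line_rate_gbps = 5.0      # USB 3.0 Gen 1 line rate, 1 bit per symbol
bus_width_bits = 128
bus_clock_mhz = line_rate_gbps * 1e3 / bus_width_bits  # ~39 MHz

# Scope side: assume 10 samples per symbol at 8-bit resolution.
sample_rate_gsps = 10 * line_rate_gbps                 # 50 Gsps (assumed)
bits_per_sample = 8
scope_data_rate_gbps = sample_rate_gsps * bits_per_sample  # 400 Gbps
```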

  • XorNot a day ago

    Being able to do this cheaply for 1G Ethernet would be incredibly useful.

    • crote 9 hours ago

      That should already be possible with hobbyist-level equipment, no? 1000BASE-T has a bandwidth of 62.5MHz and communicates at 125 megabaud. Something like the $500 Siglent SDS804X HD should be capable of handling that.
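The 1000BASE-T numbers line up: each pair runs at 125 Mbaud (PAM-5), so the highest fundamental frequency is half the symbol rate. A quick sanity check:

```python
# Sanity-checking the 1000BASE-T figures above (PAM-5, per pair).
symbol_rate_mbaud = 125.0
fundamental_mhz = symbol_rate_mbaud / 2.0      # 62.5 MHz

# Nyquist minimum sample rate for the fundamental, and the margin a
# 1 Gsps scope gives per symbol (eye-diagram work wants several).
nyquist_msps = 2.0 * fundamental_mhz           # 125 Msps
samples_per_symbol_at_1gsps = 1000.0 / symbol_rate_mbaud  # 8
```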

rkagerer a day ago

Would love to see a sample of the software when available. I wonder if it could supplement / replace my Saleae logic probe (which includes some analog channels).

LeonM a day ago

> Test Rev. 5: This should take no longer than two weeks.

Ah yes, the famous last words of expecting testing to take less than two weeks, and that all tests will pass...

  • LeonM 21 hours ago

    My post is being downvoted, but it was serious advice.

    Especially in RF hardware design, you will have to plan for the hardware revision to inevitably have problems. And in hardware design, a new revision will take at least another week for a new prototype to arrive.

    OP is on rev 5, so I'm assuming that the schematic itself has already been validated; if the schematic hasn't changed between rev 4 and rev 5, it's not unrealistic to subtract the schematic validation from the planning.

    However, OP also mentions having made many routing / placement changes, and trying to move components under a heatsink. This is where all sorts of unforeseen problems can arise; especially with high-speed, impedance-matched RF design you can run into so many RF black-magic problems. Trust me, I've been there.

    In hardware, especially when RF is involved, it's not about how long the testing/validation itself takes, but the turnaround time to get a new prototype produced.

    • datadrivenangel 14 hours ago

      I also had the same thought when I read that.

      It does seem like the schedule question here is not if testing takes two weeks, it's if rev 5.1 actually fixes the issues, and how long testing revs 5.2 and 5.3 will inevitably take.

  • Mbwagava a day ago

    Planning for and expecting are not the same thing. This is just a bad-faith interpretation.

  • bsder a day ago

    "We do these things not because they are easy, but because we thought they were easy."