VHDL's Crown Jewel (sigasi.com)

by cokernel_hacker 62 comments 149 points
[−] e7h4nz 47d ago
The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.

VHDL had this figured out as early as 1987. I spent many years writing Verilog test benches and chasing numerous race conditions; those types of bugs simply don't exist in VHDL.

The Verilog rules—using non-blocking assignments for sequential logic and blocking assignments for combinational logic—fail as soon as the scenario becomes slightly complex. Verilog is suitable when you already have the circuit in your head and just need to write it down quickly. In contrast, VHDL forces you to think about concurrent processes in the correct way. While the former is faster to write, the latter is the correct approach.

Even though SystemVerilog added some patches, the underlying execution model still has inherent race conditions.
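For what it's worth, the "separate how a value changes from when processes see it" idea fits in a few lines of Python. This is a toy model, not the real VHDL kernel, and all the names are made up:

```python
# Toy delta-cycle scheduler: signal updates are collected during a delta
# and committed together afterwards, so process execution order cannot
# change the result.

class Signal:
    def __init__(self, value):
        self.value = value      # visible value, stable during the delta
        self._next = None       # pending update, committed at delta end

    def schedule(self, value):  # like a VHDL signal assignment
        self._next = value

def run_delta(processes, signals):
    for proc in processes:      # phase 1: run processes; reads see old values
        proc()
    for sig in signals:         # phase 2: commit all updates at once
        if sig._next is not None:
            sig.value, sig._next = sig._next, None

a, b = Signal(1), Signal(2)
swap = [lambda: a.schedule(b.value), lambda: b.schedule(a.value)]

run_delta(swap, [a, b])
print(a.value, b.value)  # 2 1 -- a clean swap, no temporary needed
run_delta(list(reversed(swap)), [a, b])
print(a.value, b.value)  # 1 2 -- reversed process order, same behaviour
```

The swap works in either process order because no process ever observes a half-updated value, which is exactly the race the blocking/non-blocking conventions in Verilog try to paper over.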

[−] thesz 46d ago

  > The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.
This is what I use when I play with hardware simulation in Haskell:

  type S a = [a]

  register :: a -> S a -> S a
  register a0 as = a0:as

  -- combinational logic can be represented as ordinary pure
  -- functions and then glued into "circuits" with register
  -- and map/zip/unzip.
This thing also separates externally visible events, recorded in the (infinite) list of values, from externally unobservable pure (combinational) logic. As a bonus, one can test the combinational logic separately, with property-based testing, etc.
[−] tverbeure 47d ago
I used to be a huge VHDL proponent, talk about the delta cycle stuff, give VHDL classes at work to new college grads and such. And then I moved to the West Coast and was forced to start using Verilog.

And in the 21 years since, I’ve never once run into an actual simulation determinism issue.

It’s not bad to have a strict simulation model, but if some very basic coding style rules are followed (which everybody does), it’s just not a problem.

I don’t agree at all with the statement that Verilog fails when things become too complex. The world’s most complex chips are built with it. If there were any truth to that claim, chips couldn’t be designed reliably with it, and that’s clearly not the case.

Anyway, not really relevant, but this all reminds me of the famous Verilog vs VHDL contest of 1997: https://danluu.com/verilog-vs-vhdl/

[−] e7h4nz 47d ago
On a practical level you're right: most of my team's work is done in Verilog.

That being said, I still have a preference for the VHDL simulation model. A design that builds correctness directly into the language structure is inherently more elegant than one that relies on coding conventions to constrain behavior.

[−] formerly_proven 47d ago
This actually sounds a bit like a C/C++ argument. Roughly: Yes, you can easily write incorrect code but when some basic coding conventions are followed, UAF/double free/buffer overflows/... are just not a problem. After all, some of the world's most complex software is built with C / C++. If you couldn't write software reliably with C / C++, that could never be the case.

I.e. just because teams manage to do something with a tool does not mean the tool didn't impede (or vice versa, enable) the result. It just says that it's possible. A qualitative comparison with other tools cannot be established on that basis.

[−] lifis 47d ago
I don't understand this: isn't the thing in the article only relevant for software simulation? In hardware, ordering is arbitrary, as in Verilog, or at least depends on wire lengths that aren't specified in the HDL (unless you delay the effect until the next clock update, which it seems to me works the same in all HDLs and targets).

And afaik HDLs are used almost exclusively for hardware synthesis; I've never seen any software written in those languages.

So it doesn't seem important at all. In fact, for software simulation of hardware you'd want the simulation to randomly choose anything possible in hardware, so the Verilog approach seems correct.

[−] Taniwha 47d ago
I'm a long time verilog user (30+ years, a dozen or so tapeouts), even written a couple of compilers so I'm intimate with the gory details of event scheduling.

Used to be in the early days that some people depended on how the original Verilog interpreter ordered events. It was a silly thing (models would only run on one simulator, which caused a lot of angst).

'<=' assignment fixed a lot of these problems, using it correctly means that you can model synchronous logic without caring about event ordering (at the cost of an extra copy and an extra event which can be mostly optimised away by a compiler).
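That sample-then-commit behaviour of '<=' can be sketched in plain Python (a made-up model for illustration, not real Verilog semantics):

```python
# Blocking ('=') updates immediately, so a two-stage shift register
# collapses into a shoot-through if the statements run in source order;
# non-blocking ('<=') samples all right-hand sides first and commits
# later, so statement order never matters.

def clock_blocking(state, d):
    # like: q1 = d; q2 = q1;  -- q2 sees the *new* q1
    state["q1"] = d
    state["q2"] = state["q1"]

def clock_nonblocking(state, d):
    # like: q1 <= d; q2 <= q1;  -- RHS values sampled before any commit
    sampled = {"q1": d, "q2": state["q1"]}
    state.update(sampled)

s = {"q1": 0, "q2": 0}
clock_blocking(s, 1)
print(s)  # {'q1': 1, 'q2': 1} -- d shot through both stages in one clock

s = {"q1": 0, "q2": 0}
clock_nonblocking(s, 1)
print(s)  # {'q1': 1, 'q2': 0} -- proper one-stage-per-clock shift
```

The `sampled` dict is the "extra copy" mentioned above; a compiler can usually optimise it away.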

In combination, 'always @(*)' with '=', plus 'assign', gives you reliable combinatorial logic.

In real-world logic a lot of event ordering is non-deterministic: one signal can appear before or after another depending on temperature. All in all, it's best not to design depending on it if you possibly can. Do it right and you don't care about event ordering: let your combinatorial circuits waggle around as their inputs change and catch the result in flops synchronously.

IMHO Verilog's main problems are that it: a) mixes flops and wires in a confusing way, and b) if you stray outside the synthesisable subset, it lets you do things that depend on event ordering and can get you into trouble (though you sometimes need that to build test benches).

[−] Taniwha 46d ago
BTW my really big peeve about modern Verilog is that it never picked up {/} as synonyms for begin/end. My experiments (20 years ago) showed it was an easy extension; the minor syntactic ambiguities were trivially fixable.
[−] gsmecher 46d ago
I love that VHDL formalizes Verilog's pragmatic blundering, but emphasizing delta-cycle ordering is "inside baseball" and IMO bad marketing. VHDL's approach is conceptually clean, but from a practical perspective, this ordering doesn't (and shouldn't) matter.

Better to emphasize the type system, which makes a durable and meaningful difference to users, both experienced and new. My go-to example is fixed-point arithmetic: for VHDL, this was an extension to the IEEE libraries and didn't require a change to the underlying language (think of how C++'s std:: evolves somewhat separately from compilers). Verilog's type system is insufficiently expressive to add fixed-point types without changes to the language itself. This positions VHDL better for e.g. low-precision quantization for AI/ML.
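The library-level idea is simple enough to sketch in Python: a fixed-point value is just an integer plus a binary-point position, and arithmetic tracks the point. The names here (Fixed, frac_bits) are invented for illustration; VHDL's ieee fixed_pkg is the real thing:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fixed:
    raw: int        # underlying integer
    frac_bits: int  # position of the binary point

    @classmethod
    def from_float(cls, x, frac_bits):
        return cls(round(x * (1 << frac_bits)), frac_bits)

    def to_float(self):
        return self.raw / (1 << self.frac_bits)

    def __mul__(self, other):
        # Qa * Qb yields Q(a+b): the result widens, nothing is lost
        return Fixed(self.raw * other.raw, self.frac_bits + other.frac_bits)

a = Fixed.from_float(1.5, 4)   # 0001.1000 in Q.4
b = Fixed.from_float(0.25, 4)  # 0000.0100 in Q.4
print((a * b).to_float())      # 0.375, exact, now in Q.8
```

The point is that none of this needed new syntax, only a type that carries the binary point along; that's the kind of extension VHDL's libraries could absorb and Verilog's type system couldn't.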

In any case, the VHDL/Verilog language wars are over, and while VHDL "lost", it's clear Verilog's victory was partly Pyrrhic: RTL probably has a polyglot future, and everyone's waiting (with mixtures of resignation and hope, but very little held breath) for something better to come along.

[−] oelang 46d ago
VHDL mostly lost the ASIC consumer market and for some that's the only market that matters, but the hardware design ecosystem is much bigger than that.

I wonder what AI will do to RTL/verification, the rigid nature of VHDL may be a better target for AI than Verilog.

[−] buildbot 47d ago
Naively, as a West Coast Verilog person, VHDL delta cycles seem like a nice idea, but not what actual circuits are doing by default. The beauty and the terror of Verilog is its completely unconstrained parallel nature: by default everything evaluates at t=0, until you add clocks and state via registers. VHDL also seems to make it too easy to create latches and other abominations. (I am probably wrong, at least partially.)

((Shai-Hulud Desires the Verilog))

[−] tverbeure 47d ago
AFAIK, creating latches is just as easy in Verilog as in VHDL. They use the same model to determine when to create one.

But with a solid design flow (which should include linting tools like Spyglass for both VHDL and Verilog), it’s not a major concern.

[−] nickelpro 46d ago
SystemVerilog basically fixes this with always_comb vs always_latch.

Every major implementation warns, or even fails the flow, on accidental latch logic inside an always_comb.

[−] latenode 46d ago
VHDL gets treated like a legacy language nobody wants to touch, but the people who actually use it tend to be very serious about why they still do.
[−] SilverBirch 47d ago
Needs a [2010] tag. In almost all modern hardware development you'll have coding guidelines along the lines of "always use blocking assignments for combinational logic, always use non-blocking for sequential logic", so you end up back at the same place as VHDL. But SystemVerilog is by nature much more weakly typed than VHDL, so you have to rely on conventions to regain some level of safety.
[−] seertaak 46d ago
Interesting article; I've always been fascinated and intimidated by FPGA programming - it's one of the few remaining "dark arts" of software engineering.

> VHDL’s delta cycle algorithm is its crown jewel. It gives you built-in determinism. Let us treasure it - Verilog doesn’t have anything like it. At the same time, you will agree with me that there is nothing too complicated about the concept.

I agree with you -- and thank you!

[−] CorrectHorseBat 47d ago
The real question is, why do we even need this? Why don't VHDL and Verilog just simulate what hardware does? Real hardware doesn't have any delta cycles or determinism issues due to scheduling. Same thing with sensitivity lists (yes we have */all now so that's basically solved), but why design it so that it's easy to shoot in your own foot?
[−] SilverBirch 47d ago
What do you mean by simulate? Do you want the language to be aware of the temperature of the silicon? Because I can build you circuits whose behaviour changes due to variation in the temperature of the silicon. Essentially all these languages are not timing aware. So you design your circuit with combinatorial logic and a clock, and then hope (pray) that your compiler makes it meet timing.

The fundamental problem is that we're trying to create a simulation model of real hardware that is (a) realistic enough to tell us something reasonable about how to expect the hardware to behave and (b) computationally efficient enough to tell us about (a) in a reasonable period of time.

[−] ithkuil 47d ago
"All sensors are temperature sensors, some measure other things as well"
[−] audunw 47d ago
The only way to simulate what real hardware does is to synthesise the design, get a netlist and do a gate-level simulation. This is incredibly slow, both to compile and to simulate.

You could, of course, simplify the timing model a lot. In the end you get down to "some time passes for the signal to get through this logic; we don't know how much, but we assume it's less than any clock period", at which point we end up with delta cycles.

[−] CorrectHorseBat 46d ago
Real hardware has clock trees. Wouldn't all (most?) problems with delta cycles go away if the HDL understood the concept of clocks and clock balancing?
[−] tverbeure 47d ago

> Why don't VHDL and Verilog just simulate what hardware does?

Real hardware has hold violations. If you get your delta cycles wrong, that's exactly what you get in VHDL...

They're both modeling languages. They can model high-level RTL or gate level, and they can behave very differently if you're not careful. "Just simulate what the hardware does" is itself an ambiguous statement: sometimes you want one model, sometimes the other.

[−] artemonster 47d ago
Draw yourself an SR latch and try simulating it. Or the circuit known as a "pulse generator".
[−] hardolaf 46d ago
Both SystemVerilog and VHDL have AMS extensions for simulating analog circuits. They work pretty well but you also pay a pretty penny for the simulator licenses for them.
[−] CorrectHorseBat 47d ago
Those are analog circuits, if you put them in your digital design you are doing something wrong.
[−] artemonster 46d ago
Don't know if trolling. An SR latch you can build from 2 NANDs or NORs; there are plenty of *digital* circuits with that functionality, and yes, there are very rare cases when you construct this out of logic rather than using a library cell. The pulse circuit is AND(not(not(not(a))), a), also rarely used, but used nonetheless. To properly model/simulate them you need delta cycles.
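The pulse generator is easy to see in a discrete-time sketch. Give each inverter one unit of delay (stand-in for a delta/gate delay); a rising edge on `a` then makes the AND output go high for exactly three delays before the inverted path catches up. Illustration only; the `delay_not` helper is made up:

```python
def delay_not(xs, init):
    # unit-delay inverter: output at time t reflects the input at t-1
    out = [init]
    for x in xs[:-1]:
        out.append(1 - x)
    return out

a = [0, 0, 1, 1, 1, 1, 1, 1]   # rising edge at t=2
n1 = delay_not(a, 1)           # a inverted, one delay late
n2 = delay_not(n1, 0)
n3 = delay_not(n2, 1)          # a inverted, three delays late
pulse = [ai & ni for ai, ni in zip(a, n3)]
print(pulse)  # [0, 0, 1, 1, 1, 0, 0, 0] -- a pulse three delays wide
```

A zero-delay model with no delta cycles would evaluate n3 as simply not(a) at the same instant and the pulse would never appear, which is why you need some notion of ordered zero-time updates to simulate this circuit at all.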
[−] kryptiskt 46d ago
All circuits are analog when physically realized, the digital view is an abstraction.
[−] variadix 46d ago
I’m not exactly sure what you’re getting at, but I think I’ve had a similar question: why don’t HDLs have language elements more representative of what digital circuits are constructed from, namely synchronous and asynchronous circuits, rather than imperative input-triggered blocks (processes, IIRC; it’s been a while)?

I always thought it was confusing to design a circuit mentally (or on paper) out of things like muxes, encoders, flip flops, etc. and not have language-level elements to represent these things (without defining your own components obviously).

I remember looking this up, and I believe it’s because the languages were originally designed for simulation and verification, and there are things you might want to do in a simulation/verification language for testing that are outside of what the hardware can do. Mixing the two is confusing IMO, but clearly demarcating the hardware-realizable subset of the language would be better than the current state.

[−] jeffreygoesto 47d ago
Reminds me a lot of "Logical Execution Time" and the work of Edward Lee ("The Problem with Threads") for a software equivalent. Determinism needs separation of computation from communication.
[−] arianvanp 47d ago
Sounds like the reachability problem in Petri nets to me?
[−] artemonster 47d ago
Please stop bickering about Verilog vs VHDL: if you use NBAs, the scheduler works exactly the same in modern-day simulators. There is no crown jewel in VHDL anymore. Also, the type system is annoying; it's just in your way, not helping at all.