This is a follow-up to the Logic Transformation problem.
We are taught that computers work using boolean (i.e. mathematical) logic, and that boolean logic has two values (true and false) along with a number of operations (and, or, not etc).
Unexpectedly for many, computers do not work exactly this way. They use logic implemented in electronics, and this induces a number of funny effects. There are things like strong (good) false and weak (poor) true. There is the popular high-Z state. And in the domain of time there are glitches. Let's look closer at them!
Consider the logic schematic shown on the left. Circles (on an input or output) mean inversion. So here a single signal X is connected to both inputs of an AND block, but it reaches one of them through a dedicated NOT block. The formula is therefore X and not(X)
. Well, this sounds ridiculous, at least in the mathematical sense. The output should never be True, since one of the inputs of the AND is always False - right?
Not exactly. Real logic elements (i.e. those implemented in electronics) have some delay in propagating a signal through them. If we apply signal X to the input of NOT, the ~X value appears on its output a few nanoseconds later! This seems to be a trifle, but it changes the situation dramatically.
Look at the signal-time charts on the right of the image. Here we use t for the internal signal (the output of NOT).
Since we have this delay, on switching X from 0 to 1 (we call it the rising edge, or front edge, or even just "front") we get a few nanoseconds with two 1s on the inputs of the AND block - and hence a short-lived logic True on the output.
That's what we call a Glitch. It could be undesirable (if the output controls the inflation of emergency "airbags" in a car, for example). On the other hand, computers heavily utilize this phenomenon to detect "edges" of a signal. All changes of logic state in the processor, memory etc are triggered by edges coming from the "CPU clock" - thus every chip needs to have such an edge detector somewhere inside.
The chart for Y is slightly inaccurate, as AND should also impose a small delay, but that isn't important for this case.
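
To make the timing concrete, here is a minimal sketch in Python (the one-step delay and the simulate helper are illustrative assumptions, not part of the task): it feeds a waveform of X through a NOT with one step of delay and then through an ideal AND, reproducing the glitch at the rising edge.

    # Minimal sketch: Y = X and not(X), with a 1-step delay on NOT.
    # AND is treated as instantaneous, as the note above allows.
    def simulate(x_wave):
        t_prev = 1 - x_wave[0]         # NOT output, settled before we start
        t_wave, y_wave = [], []
        for x in x_wave:
            t_wave.append(t_prev)      # NOT still shows the previous value
            y_wave.append(x & t_prev)  # AND of current X and delayed ~X
            t_prev = 1 - x             # NOT catches up one step later
        return t_wave, y_wave

    x = [0, 0, 1, 1, 1, 0, 0]
    t, y = simulate(x)
    print(y)  # [0, 0, 1, 0, 0, 0, 0] - a short-lived 1 at the rising edge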
Consider some complicated logic function, described in exactly the same way as in the previous task. The goal is to find out which input toggles lead to glitches on the output. As there are 4 inputs, we regard all 16 possible input states. For every one of them let's try to toggle every single bit - so there are 64 cases (we don't try to toggle more than one input simultaneously).
Let's agree that the delay is the same in every single function, and doesn't depend on whether the inputs are inverted or not.
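
This rule suggests a simple synchronous simulation: at every step, each defined signal recomputes from the values its inputs had one step earlier, until nothing changes any more. A sketch under that assumption (the funcs representation and the settle name are mine):

    # Sketch: unit-delay simulation - every gate sees last step's inputs.
    def settle(funcs, values, max_steps=100):
        history = [dict(values)]
        for _ in range(max_steps):
            prev = history[-1]
            nxt = dict(prev)
            for name, f in funcs.items():
                nxt[name] = f(prev)
            if nxt == prev:            # the network has settled
                return history
            history.append(nxt)
        return history                 # safety stop for oscillating nets

    # The schematic above: t = not X, Y = X and t
    funcs = {"t": lambda s: 1 - s["x"], "y": lambda s: s["x"] & s["t"]}
    start = {"x": 1, "t": 1, "y": 0}   # settled for x=0, then x toggled to 1
    print([s["y"] for s in settle(funcs, start)])  # [0, 1, 0] - the glitch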
The description of each glitch found shall be like this:
12.b->0101
First, a value in the range 0..15 describes the initial state of the inputs (i.e. write down the binary number with the state of input a as the lowest bit and d as the highest, then convert it to decimal).
Then the input to toggle (a..d) follows, separated by a dot.
At last, after the -> symbol, there is a sequence of 0s and 1s showing how the signal changes. The first digit is the initial state of the output, the last is the resulting state. Of course, if the sequence consists of only one or two digits, this means there was no glitch - we are not interested in such cases.
Thus the example above means we had a=0, b=0, c=1, d=1 and toggled b (to 1) - which resulted in the output switching from 0 to 1, but with a couple of glitches during the change.
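
In other words, such a record can be built by collapsing consecutive repeats of the simulated output waveform, roughly like this (a sketch; the collapse helper is mine):

    # Drop consecutive duplicates: [0,0,1,0,0,1,1] -> "0101"
    def collapse(wave):
        seq = [wave[0]]
        for v in wave[1:]:
            if v != seq[-1]:
                seq.append(v)
        return "".join(map(str, seq))

    seq = collapse([0, 0, 1, 0, 0, 1, 1])
    if len(seq) > 2:              # 1 or 2 digits would mean no glitch
        print(f"12.b->{seq}")     # 12.b->0101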
Input: exactly the same as in the previous task.
Answer: should give the states and toggles which lead to glitches (along with these glitches). Please have it ordered first by the value representing the inputs, then by the bit to be toggled.
Example
input:
7 k
f := b xor ~e
i := h and g
g := d or ~a
j := ~i and b
k := h xor j
h := ~d and f
e := c or ~d
answer:
0.b->0101 1.b->010 2.d->101 3.b->010 4.b->0101 5.b->010 6.d->101 7.b->010 8.d->010 9.d->010 10.d->101 14.d->101
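
For reference, a complete solver might look roughly like this - a sketch combining the pieces above, assuming the input format from the example and a network without feedback loops; feeding it the example input should reproduce the example answer.

    # Sketch of a full solver (assumptions: acyclic network, unit delay
    # per function as agreed above, input format as in the example).
    OPS = {"and": lambda p, q: p & q,
           "or":  lambda p, q: p | q,
           "xor": lambda p, q: p ^ q}

    def parse(lines):
        n, out = lines[0].split()
        funcs = {}
        for line in lines[1:1 + int(n)]:
            name, expr = line.split(" := ")
            a, op, b = expr.split()
            funcs[name] = (a, op, b)   # operands may carry a ~ prefix
        return funcs, out

    def val(values, token):            # resolve an operand, maybe inverted
        return 1 - values[token[1:]] if token[0] == "~" else values[token]

    def settle(funcs, values):
        history = [values]
        while True:                    # an acyclic network always settles
            prev = history[-1]
            nxt = dict(prev)
            for name, (a, op, b) in funcs.items():
                nxt[name] = OPS[op](val(prev, a), val(prev, b))
            if nxt == prev:
                return history
            history.append(nxt)

    def collapse(wave):                # same helper as above
        seq = [wave[0]]
        for v in wave[1:]:
            if v != seq[-1]:
                seq.append(v)
        return "".join(map(str, seq))

    def solve(text):
        funcs, out = parse(text.strip().splitlines())
        results = []
        for state in range(16):        # ordered by input value first...
            inputs = {c: (state >> i) & 1 for i, c in enumerate("abcd")}
            stable = settle(funcs, {**inputs, **{k: 0 for k in funcs}})[-1]
            for bit in "abcd":         # ...then by the toggled bit
                start = dict(stable)
                start[bit] ^= 1
                seq = collapse([s[out] for s in settle(funcs, start)])
                if len(seq) > 2:       # longer than a plain transition
                    results.append(f"{state}.{bit}->{seq}")
        return " ".join(results)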