Since we're talking about pipelining bugs, it's essential to first understand pipelining.
According to an article from Techopedia, pipelining is the
process of accumulating and executing computer instructions and tasks from the
processor via a logical pipeline. It allows storing, prioritizing, managing,
and executing tasks and instructions in an orderly process. Sometimes, however,
bugs affect the pipeline and change the computer's behavior. A notable example
of a pipelining bug leading to unexpected behavior in computer usage and design
is the Pentium chip failing at math. In 1994, an entire line of CPUs from the
market leader, Intel, couldn't do their math. The Pentium floating-point flaw ensured that no matter
what software you used, your results stood a chance of being inaccurate past
the eighth decimal point. The problem lay in a faulty math coprocessor, a
floating-point unit. The result was a slight possibility of tiny errors in
hardcore calculations, but it was a costly PR debacle for Intel.
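The flaw was easy to demonstrate. The most widely cited trigger, from contemporary public reports of the bug, is the division 4195835 / 3145727: a flawed Pentium returned roughly 1.33373906 instead of the correct 1.33382044. A minimal sketch in Python (the trigger operands and the flawed quotient are the reported values, not something any modern CPU will reproduce):

```python
# The canonical FDIV trigger pair from 1994-era public reports of the bug.
dividend = 4195835.0
divisor = 3145727.0

correct = dividend / divisor   # any non-flawed FPU: ~1.3338204491...
flawed = 1.33373906822         # quotient widely reported for a flawed Pentium

print(f"correct quotient: {correct:.10f}")
print(f"flawed  quotient: {flawed:.10f}")
print(f"relative error:   {abs(correct - flawed) / correct:.2e}")
```

The relative error is on the order of 1e-5, tiny in absolute terms but catastrophic for anyone relying on high-precision results.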
Things went wrong because the lookup table consisted of
1,066 table entries downloaded into the programmable logic array of the chip.
But only 1,061 entries made it onto the first-generation Pentiums; five got
lost on the way.
Intel's laudable idea
was to triple the execution speed of floating-point calculations by ditching
the previous-generation 486 processor's clunky shift-and-subtract algorithm and
substituting a lookup-table approach in the Pentium, but unfortunately, things
did not go as planned.
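For contrast, the "clunky" approach produces one quotient bit per iteration, which is exactly why it is slow. Below is a minimal illustrative sketch of a shift-and-subtract (restoring) integer divider in Python, not the 486's actual microcode; the Pentium's table-driven SRT algorithm instead consulted the (incompletely loaded) lookup table to select several quotient bits per step:

```python
def restoring_divide(dividend, divisor, bits=32):
    """Shift-and-subtract (restoring) division: one quotient bit per step."""
    assert divisor != 0
    quotient = 0
    remainder = 0
    for i in range(bits - 1, -1, -1):
        # Shift the next dividend bit into the partial remainder.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        if remainder >= divisor:
            # Subtraction succeeds: record a 1 in the quotient.
            remainder -= divisor
            quotient |= 1 << i
    return quotient, remainder

print(restoring_divide(100, 7))  # (14, 2): 100 = 7 * 14 + 2
```

Because each loop iteration settles only one bit, a 64-bit quotient needs 64 passes; the table-based design traded those passes for fast table lookups, which is why the five missing entries mattered so much.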
In January 1995, Intel announced a pretax charge of $475
million against earnings, most of which stemmed from replacing flawed
processors. The bottom line in this arithmetic mess is this: In lookup-table
and money calculations, 1,066 – 5 = –$475,000,000. Any way you look at it,
that's bad math.