Jim Omura, "Reflections on Andrew Viterbi and his Algorithm," March 8, 2005.
The contents of this folder were not donated by Dr. Andrew Viterbi. They were printed
from different sources by Michael Hooks, the Viterbi Family Archivist, who created this
folder. The source of each set of print-outs is provided with the appropriate presentation.
USC Viterbi School of Engineering: 2005 Agenda http://viterbi.usc.edu/news/events/viterbi_lecture/2005_agenda.htm
Viterbi Conference:
Advancing Technology through Communications Sciences
March 8-9, 2004 [2005]
Schedule of Events
Event Summary
Tuesday, March 8
8:00-9:00 Registration and Continental Breakfast (Davidson Conference Center)
9:00-5:00: Technical Program (Davidson Conference Center)
12:00-1:30 Lunch (Vineyard Room)
5:00-6:00 Reception (Lobby of Davidson Conference Center)
6:00-7:00 Viterbi Distinguished Lecture (Embassy Room)
Jacob Ziv
What is Hidden in an Individual Sequence?
Universal Data-Compression Revisited
Wednesday, March 9
8:00-9:00 Continental Breakfast
9:00-12:00 Technical Program (Embassy Room: Davidson Conference Center)
12:00-1:30 Lunch (Vineyard Room)
1:30-2:00 Andrew Viterbi, USC Presidential Chair Professor of Electrical Engineering
Reminiscences and Closing Remarks
3:00-4:00 Communication Sciences Institute Research Overview and Student Poster Session
(Club Room: Davidson Conference Center)
Technical Program: Day 1
Tuesday, March 8
8:00-9:00 Registration and Continental Breakfast
9:00-9:30 C. L. Max Nikias, Dean of the USC Viterbi School of Engineering
Welcoming Remarks
Technical Session I: The Viterbi Algorithm
Session Chair: G. David Forney, Jr.
11/14/2006 10:01 AM
9:30-10:15 Plenary Talk
G. David Forney, Bernard M. Gordon Adjunct Professor, MIT
History of the Viterbi Algorithm
10:15-11:00 Break
11:00-11:30 Dr. Jim Omura
Reflections on Andrew Viterbi and his Algorithm
11:30-12:00 Paul Siegel, Professor, Electrical & Computer Engineering, UCSD
Applications of the Viterbi Algorithm in Data Storage Technology
12:00-1:30 Lunch (Vineyard Room)
Technical Session II: Space Communications
Session Chair: William C. Lindsey
1:30-2:15 Plenary Talk
William C. Lindsey, Professor, Electrical Engineering, USC
Andrew J. Viterbi's Contributions to Space Communications: A Systems Perspective
2:15-3:00 Break
3:00-3:30 Robert J. McEliece,
Allen E. Puckett Professor, Electrical Engineering, Caltech
Viterbi's Impact on the Exploration of the Solar System
4:00-4:30 Dr. William W. Wu, Founder & CTO, ATMco
Andrew Viterbi's Impact in International Satellite Communications
4:30-5:00 Dr. Bernard Sklar, Director of Advanced Systems Communications, Engineering
Services
What did Andy Viterbi do for Communications? Let Me Count the Ways
5:00-6:00 Reception (Lobby of Davidson Conference Center)
6:00-7:00 Viterbi Distinguished Lecture (Embassy Room)
Dr. Jacob Ziv
What is Hidden in an Individual Sequence?
Universal Data-Compression Revisited
Technical Program: Day 2
Wednesday, March 9
8:00-9:00 Continental Breakfast
9:00-9:20 Solomon Golomb, Andrew and Erna Viterbi Professor, Electrical Engineering, USC
Reminiscences on Andrew Viterbi and his impact
Technical Session III: Code Division Multiple Access (CDMA)
Session Chair: Roberto Padovani, CTO, Qualcomm, Inc.
9:20-10:05 Plenary Talk
Roberto Padovani, CTO, Qualcomm, Inc.
Ten years of progress in CDMA
10:05-10:30 Break
10:30-11:00 Robert A. Scholtz, Fred H. Cole Professor, Electrical Engineering, USC
CDMA in Retrospect
11:00-11:30 Sergio Verdu, Professor, Electrical Engineering, Princeton
Multiuser Detection: A Historical Account
11:30-12:00 David Tse, Professor, Electrical Engineering & Computer Science, UC Berkeley
CDMA vs. Opportunistic Communication: A Tale of Two Views
12:00-1:30 Lunch (with remarks by C. L. Max Nikias)
1:30-2:00 Andrew Viterbi, USC Presidential Chair Professor of Electrical Engineering
Reminiscences and Closing Remarks
2:00-2:30 Break
2:30-3:00 Keith Chugg and Urbashi Mitra, Director(s), Communication Sciences Institute,
Electrical Engineering, USC
Current Communications Research at USC
3:00-4:00 Communication Sciences Institute (CSI) Student Poster Session
General Chair: Solomon Golomb
Organizing Committee:
Annette Blain (external relations)
Mayumi Thrasher (arrangements)
Milly Montenegro (registration)
Gerielyn Ramos (facilities)
Technical Program Committee:
Keith M. Chugg
G. David Forney, Jr.
P. Vijay Kumar
William C. Lindsey
Urbashi Mitra
Roberto Padovani
Robert A. Scholtz
Jack K. Wolf
Corporate Sponsors:
USC Viterbi School of Engineering: Viterbi Lecture http://viterbi.usc.edu/viterbiconference/
2005 Viterbi Conference & Lecture
Sponsored by Qualcomm
Viterbi Conference
Advancing Technology through Communications Sciences
The Viterbi Conference will be held at the University of Southern
California on March 8 and 9, 2005. This technical symposium
will feature fifteen distinguished speakers from areas where Dr.
Andrew Viterbi has made a significant impact. The speakers will
provide a mixture of technical, historical and anecdotal material. The Viterbi Conference coincides
with the 2005 Viterbi Lecture. The 2005 Viterbi Lecture will be given by Jacob Ziv, Distinguished
Professor of Electrical Engineering at the Technion - Israel Institute of Technology, on the
evening of March 8. In addition, the University of Southern California will be dedicating the Viterbi
Museum in the new Ronald Tutor Hall of Engineering. Finally, the two days are an opportunity to
honor Professor Viterbi's contributions on the occasion of his 70th birthday.
Viterbi Lecture
The Viterbi Lecture was created as the USC Viterbi School of Engineering's premier academic
distinction in information technology and digital communications, an area of research in which the
School is a national leader. Each year, an awardee who has made fundamental contributions of
profound impact in communication will present the Viterbi Lecture.
2005 Viterbi Conference and Lecture
March 8 and 9, 2005
• Tentative Agenda
• Abstract and Biography Information for 2005 Viterbi Lecturer Dr. Jacob Ziv
• Speakers' Biographies
• Hotel Information
• Conference Registration Brochure
11/14/2006 10:06 AM
Previous Lecturers:
• 2003: G. David Forney, Jr., "Not your father's coding theory"
• 2002: Andrew J. Viterbi (inaugural lecture)
G. David Forney, Jr., "The Viterbi Algorithm: A Personal History," presented at the
Viterbi Conference: Advancing Technology through Communications Sciences, March 8-9,
2005, at the University of Southern California. This copy was printed from
http://arxiv.org/PS_cache/cs/pdf/0504/0504020v2.pdf.
[The copy of this paper was added to the Viterbi Family Archives by Michael Hooks,
Viterbi Family Archivist, and was not part of the original donation by Dr. Andrew
Viterbi.]
The Viterbi Algorithm: A Personal History
G. David Forney, Jr.
MIT
Cambridge, MA 02139 USA
forneyd@comcast.net
Abstract
The story of the Viterbi algorithm (VA) is told from a personal perspective. Applications
both within and beyond communications are discussed. In brief summary, the VA has proved
to be an extremely important algorithm in a surprising variety of fields.
1 Introduction
Andrew J. Viterbi is rightly celebrated as one of the leading communications engineers and
theorists of the twentieth century. He has received almost every professional award possible,
including election not only to the National Academy of Engineering (USA) but also to the
National Academy of Sciences (USA), where he chairs the Computer Science section. His award
citations usually cite "invention of the Viterbi algorithm" as his most notable accomplishment.
On the other hand, Andy would be the first to tell you that other people deserve much of the
credit for recognizing its theoretical properties and its practical attractiveness, and for extending
its domain of application. He has often told this story himself (see, e.g., [33]).
Nevertheless, no one doubts that Andy's awards are entirely deserved, and that their focus on
the Viterbi algorithm (VA) is appropriate. This article will attempt to explain why, by briefly
recounting the history of the VA. It is a "personal history," because the story of the VA is so
intertwined with my own history that I can recount much of it from a personal perspective.
2 Invention of the Viterbi algorithm
The Viterbi algorithm was first presented in Andy's famous 1967 paper [30] to help prove an
asymptotically optimum upper bound on the error probability of convolutional codes, which had
previously been derived by Yudkin in the context of sequential decoding [37]. In this paper, the
VA is presented just as we understand it today. This paper introduces the important concept of
survivors (a term possibly borrowed from tennis elimination tournaments), and shows that only
q^K survivors need be retained to decode a convolutional code with constraint length K over the
q-ary field GF(q). Compared to a block code with q^K codewords, such a convolutional code is
shown to have a much better error exponent, particularly near capacity.
Andy recalls in a 1999 interview [22] that
the Viterbi algorithm for convolutional codes ... came out of my teaching. ... I
found information theory difficult to teach, so I started developing some tools. ... I
wrote the first paper in March '66, but it wasn't published until April '67. ... At
one point I was actually discouraged from publishing the algorithm details. Fortunately,
one of the reviewers, Jim Massey, encouraged me to include the algorithm.
... Nobody thought that it had any potential for practical value ...
It is clear from the paper that at this point Andy had no idea that the VA was actually
an optimum (maximum likelihood) decoder, nor that it was potentially practical. Indeed, the
paper states that "this decoding algorithm is clearly suboptimal," and concludes: "Although
this algorithm is rendered impractical by the excessive storage requirements, it contributes to a
general understanding of convolutional codes and sequential decoding through its simplicity of
mechanization and analysis" [30].
3 Discovery that the VA is optimum
I believe that I received a copy of Andy's paper prior to publication, probably via Jim Massey.
At that time I was working at Codex Corp., a small start-up company aiming at practical
applications of convolutional codes. Our primary focus was initially on threshold decoding,
which was the subject of Jim's doctoral thesis [21]; Jim was a consultant. Subsequently, we
developed a sequential decoding system [36] for the Pioneer deep-space satellite program, which
became the first code in space [5].
I had been trying to understand why in practice convolutional codes were generally superior
to block codes, so I studied Andy's paper with great interest. I realized that the path-merging
property of convolutional codes could be depicted in what I called a trellis diagram, to contrast
with the then-conventional tree diagram used in the analysis of sequential decoding. It was then
only a small step to see that the Viterbi algorithm was an exact recursive algorithm for finding
the shortest path through a trellis, and thus was actually an optimum trellis decoder. I believe
that at that point I called Andy, and told him that he had been too modest when he asserted
that the VA was "asymptotically optimum."
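In modern terms, the recursion over the trellis is short enough to sketch in code: each state keeps a single surviving path, and each step extends every survivor by every possible input, keeping only the best extension per state. The following is a minimal Python sketch, assuming for illustration the standard textbook rate-1/2, constraint-length-3 convolutional code with octal generators (7, 5); that particular code is an assumption of this sketch, not one discussed in the text.

```python
def encode(bits):
    """Rate-1/2 convolutional encoder with octal generators (7, 5).
    The state is the last two input bits (a standard textbook example)."""
    s = (0, 0)
    out = []
    for b in bits:
        out += [b ^ s[0] ^ s[1],   # generator 111 (octal 7)
                b ^ s[1]]          # generator 101 (octal 5)
        s = (b, s[0])
    return out


def viterbi_decode(received):
    """Hard-decision Viterbi decoding as a shortest-path search through
    the trellis: one minimum-Hamming-distance survivor per state."""
    survivors = {(0, 0): (0, [])}          # state -> (path metric, decoded bits)
    for k in range(len(received) // 2):
        r = received[2 * k: 2 * k + 2]     # the two channel bits for this step
        new = {}
        for s, (metric, path) in survivors.items():
            for b in (0, 1):               # extend each survivor by both inputs
                out = [b ^ s[0] ^ s[1], b ^ s[1]]
                m = metric + (out[0] != r[0]) + (out[1] != r[1])
                ns = (b, s[0])
                if ns not in new or m < new[ns][0]:
                    new[ns] = (m, path + [b])
        survivors = new                    # never more than one survivor per state
    return min(survivors.values())[1]      # best path over all ending states
```

On an error-free received sequence this returns the encoded bits exactly; replacing the Hamming branch metric with a soft metric gives the general form of the algorithm.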
These results were written up in a 1967 technical report [6] for NASA Ames Research Center.
They were not published in journal form until many years later, in [9] and [10].
Shortly afterward, in a paper submitted in May 1968 [23], Jim Omura observed that the VA was
simply the standard forward dynamic programming solution to maximum-likelihood decoding
of a discrete-time, finite-state dynamical system observed in memoryless noise. Beyond proving
optimality in a different way, he thus made the first connection between the VA and system and
control theory. It is interesting to speculate whether the history of the VA would have been
different if it had simply been called "dynamic programming" from the beginning.
At this point, none of us had recognized that the VA might be practical. Jim's paper concludes:
"... the decoding algorithm discussed here grows exponentially in complexity with constraint
length v and is therefore impractical for large v ...." More embarrassingly, in a 1970 IEEE
SPECTRUM paper [7] describing practical coding schemes for the space channel, I wrote:
Sequential decoding [is] the best-performing practical technique known for memoryless
channels like the space channel, and will probably be the general-purpose
workhorse for these channels in the future ....
[The Viterbi algorithm] is competitive in performance with sequential decoding for
moderate error rates, but cannot achieve very low error rates efficiently. On the other
hand, it [is] capable of extremely high speeds (tens of megabits), where sequential
decoders become uneconomic. It therefore may find application in high-data-rate
systems with modest error requirements, such as digitized television.
4 Recognition that the VA is practical
Andy has always said that Jerry Heller was the first person to realize that the VA might be
practical. Jerry simulated the performance of short-constraint-length codes at the Jet Propulsion
Laboratory (JPL) in 1968-69 [13, 14], and found that with only a 64-state code he could obtain
a sizable coding gain, of the order of 6 dB.
In 1968, Andy, Irwin Jacobs, and Len Kleinrock incorporated Linkabit Corp. in San Diego as a
vehicle to pool their consulting efforts and to obtain small government study contracts. All kept
their jobs as professors. In 1969, Jerry Heller was hired as Linkabit's first full-time employee.
Linkabit obtained some small Navy and NASA contracts, which enabled the construction of a
VA prototype in 1969-70. "It was a big monster filling a rack" [22].
The first IEEE Communication Theory Workshop in 1970 in St. Petersburg became famous as
the "coding is dead" workshop, after Ned Weldon and other speakers worried publicly that coding
theory had come to a dead end. But what I remember best from that session is Irwin Jacobs
standing up in the back row flourishing an integrated circuit (a 4-bit shift register, I believe)
and asserting that this represented the future of coding. He was quite right. (Unfortunately, by
this time Codex had made a business decision to get out of coding.)
By 1971, Linkabit had implemented a 2 Mb/s, 64-state Viterbi decoder. In a special issue
on coding of the IEEE TRANSACTIONS ON COMMUNICATION TECHNOLOGY in October 1971,
Heller and Jacobs [15] discuss this decoder and many practical issues in careful detail. They
compare the VA with sequential decoding, and conclude that the VA will often be preferable
because it can use quantized soft decisions easily, and is less sensitive to channel and equipment
variations. In the same issue Cohen, Heller and Viterbi [3] describe a system using orthogonal
convolutional codes and the VA for asynchronous multiple-access communications, and Viterbi
[32] introduces generating-function analysis techniques for the VA.
During the 1970s, through the leadership of Linkabit and JPL, the VA became part of the
coding standard for deep-space communication, ultimately in a concatenated coding system
with a Reed-Solomon (RS) outer code. Linkabit developed a relatively inexpensive and flexible
VA chip, and the VA became a nice little business for Linkabit. It didn't hurt that the inventor
of the Viterbi algorithm was a Linkabit founder. The VA also began to be incorporated in many
other communications applications.
In the early 1990s, JPL built a 2^14-state "Big Viterbi Decoder" (BVD) with 8192 parallel
add-compare-select (ACS) units, which operated at a rate of the order of 1 Mb/s [4]. As far as
I know, the BVD remains the biggest Viterbi decoder ever built.
When the primary antenna failed to deploy during the Galileo mission in 1992, JPL devised an
elaborate concatenated coding scheme involving a 2^14-state, rate-1/4 inner convolutional code
and a set of variable-strength RS outer codes, and reprogrammed it into the spacecraft computers.
This scheme was able to operate within about 2 dB of the Shannon limit at a bit error probability
of less than 10^-6, which was the world record prior to the advent of turbo codes [5].
5 The VA and intersymbol interference channels
In the late 1960s, Codex turned its attention to the voiceband modem business. Our first-
generation product was a single-sideband (SSB) 9600 b/s modem with a so-called Class IV
or 1 - D^2 "partial response." About 1969, I recognized that the symbol correlation that was
thus introduced could be exploited by an ad hoc error correction algorithm, which was able to
improve the noise margin by about 2-3 dB. This little decoder extended the commercial life of
this marginal-performance modem by perhaps a year or two.
It took me a while to understand that I had in fact invented a maximum-likelihood sequence
detector for this modem. Over time, I realized that this was nothing more than the Viterbi
algorithm again, streamlined for the 1 - D^2 response. This led to a 1972 paper [8] that showed
that the VA could be used as a maximum-likelihood sequence detector for digital sequences in
the presence of intersymbol interference (ISI) and additive white Gaussian noise (AWGN).
Meanwhile, Jim Omura had recognized independently at UCLA that the VA could be used
on intersymbol interference channels, because of their convolutional character [24]. Indeed,
a tantalizing hint in this direction appears in a book review by Andy Viterbi in 1970 [31].
After visiting UCLA, Hisashi Kobayashi further developed this idea, particularly for practical
applications in partial response modems and magnetic recording [18, 19].
The VA proved to be too complicated for general use as an equalizer on ISI channels. However,
it stimulated many suboptimal approximations, and analysis of its performance gave bounds on
the best possible performance of any sequence detector.
However, the VA did become standard in the related application of high-density magnetic
recording. In so-called PRML systems ("partial-response equalization with maximum-likelihood
sequence detection") [17], the magnetic recording channel is first equalized to a simple "partial
response" such as 1 - D^2, and the resulting sequence is then detected by the VA, or by a
simplified version thereof, as Kobayashi had envisioned [18]. In retrospect, it seems possible
that my little SSB modem decoder was the first implementation of such a PRML scheme.
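The same recursion carries over to the 1 - D^2 channel by replacing the Hamming branch metric with a squared-error metric. The sketch below illustrates the idea only, not any historical implementation; the ±1 signaling alphabet and the all-ones initial condition are assumptions of this sketch.

```python
def mlse_1_minus_d2(y):
    """Viterbi (maximum-likelihood sequence) detection for the 1 - D^2
    partial response: the noiseless channel output is x[k] - x[k-2]
    for symbols x[k] in {-1, +1}.  State = (x[k-1], x[k-2]), and we
    assume x[-1] = x[-2] = +1 as the initial condition."""
    survivors = {(1, 1): (0.0, [])}        # state -> (metric, detected symbols)
    for yk in y:
        new = {}
        for (x1, x2), (metric, path) in survivors.items():
            for x in (-1, 1):
                m = metric + (yk - (x - x2)) ** 2   # squared-error branch metric
                ns = (x, x1)
                if ns not in new or m < new[ns][0]:
                    new[ns] = (m, path + [x])
        survivors = new
    return min(survivors.values(), key=lambda t: t[0])[1]
```

A four-state detector of this shape tracks exactly the two past symbols that the 1 - D^2 response remembers.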
6 Trellis-coded modulation
After Gottfried Ungerboeck published his invention of trellis-coded modulation in 1982 [29],
the VA became the workhorse decoder for the next several generations of voiceband modems.
Ungerboeck extended trellis coding to multilevel constellations by constructing trellis codes in
which each branch of the trellis represents a subset of constellation symbols, rather than a single
symbol. By clever constellation partitioning and attention to distances between subsets, he was
able to obtain coding gains in the bandwidth-limited regime comparable to those that can be
obtained in the power-limited regime.
For example, the V.32 modem (1986) used an 8-state trellis code to obtain a coding gain of
about 3.5 dB, while the later V.34 modem (1994) used 16- to 64-state trellis codes to obtain
coding gains of 4.0 to 4.5 dB [11].
7 Applications in mobile and broadcast communications
The mobile communications channel is subject to fading, bursts, and multiuser interference, and
is a much more difficult medium than the AWGN and linear Gaussian channels discussed above.
The designers of second-generation (2G) cellular systems used every tool available at the time
(early 1990s) to provide reliable communication on this difficult channel.
The CDMA system developed by Qualcomm uses a 2^8-state, rate-1/3 convolutional code with
interleaved 64-orthogonal modulation, and of course a Viterbi decoder. The TDMA system
developed for GSM uses the VA not only to decode a 16-state, rate-1/2 convolutional code
but also for equalization. A soft-output Viterbi algorithm (SOVA) is often used in the latter
application [5].
VA decoders are currently used in about one billion cellphones, which is probably the largest
number in any application. However, the largest current consumer of VA processor cycles is
probably digital video broadcasting. A recent estimate at Qualcomm is that approximately 10^15
bits per second are now being decoded by the VA in digital TV sets around the world, every
second of every day [25].
8 General application to hidden Markov models
In 1973, I wrote a tutorial paper on the Viterbi algorithm for the PROCEEDINGS OF THE IEEE
[9] that has turned out to be my most cited paper by far. A recent search using Google Scholar
shows 734 citations, far more than the 181 for my next-most-cited reference.
One of the main points of that paper was that the VA can be applied to any problem that
involves detecting the output sequence of a discrete-time, finite-state machine in memoryless
noise, i.e., to detection and pattern recognition problems involving hidden Markov models
(HMMs). Of course, decoding of convolutional codes and sequence detection on ISI channels
were the main applications discussed in that paper.
During the 70s and 80s, the VA became widely used in a variety of pattern recognition problems
that could be described by HMMs, particularly for speech recognition; see [26]. Here the VA is
often used as the M-step of an EM algorithm, which also adjusts HMM parameters.
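In the HMM setting the recursion maximizes log probability instead of minimizing distance, but is otherwise unchanged. The following is a generic textbook sketch; the table layout and the two-state example in the test are illustrative assumptions, not tied to any system cited here.

```python
import math

def hmm_viterbi(obs, states, log_pi, log_A, log_B):
    """Most likely hidden-state path for an HMM, in the log domain:
    log_pi[s] = log P(state s at t=0), log_A[p][s] = log P(p -> s),
    log_B[s][o] = log P(observation o | s).
    One survivor (backpointer) per state, as in the trellis picture."""
    V = [{s: log_pi[s] + log_B[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            # best predecessor for state s at this step
            prev = max(states, key=lambda p: V[-1][p] + log_A[p][s])
            col[s] = V[-1][prev] + log_A[prev][s] + log_B[s][o]
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    last = max(V[-1], key=V[-1].get)        # best final state
    path = [last]
    for ptr in reversed(back):              # trace the survivor chain backward
        path.append(ptr[path[-1]])
    return path[::-1]
```

With two "sticky" states that prefer different symbols, an observation run such as a, a, b, b decodes to a single switch between the states.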
Indeed, a recent search of IEEE Xplore shows that most current IEEE references to the VA
occur in such Transactions as PATTERN ANALYSIS AND MACHINE INTELLIGENCE or SYSTEMS,
MAN AND CYBERNETICS, rather than in COMMUNICATIONS or INFORMATION THEORY. It seems
that everyone in these fields knows how to "Viterbi the data."
Finally, in the past decade, the VA has become widely used in much more distant fields such
as computational biology, e.g., to locate genes in DNA sequences. See for example [16], with its
"Viterbi Exon-Intron Locator" (VEIL).
9 Related algorithms
In the past decade, the development of the field of "codes on graphs" and their related decoding
algorithms has led to a remarkable conceptual unification of a variety of detection and estimation
algorithms which have been introduced under various names for various applications.
In his 1996 dissertation, generalizing the earlier work of Gallager [12] and Tanner [27], Niclas
Wiberg [34, 35] developed the generic "sum-product" and "min-sum" decoding algorithms for
cycle-free graphs which may include both symbol (observable) and state (hidden) variables. For
trellis graphs, he showed that these reduce to the BCJR algorithm [2] and an algorithm equivalent
to the Viterbi algorithm, respectively. For capacity-approaching codes such as turbo codes and
low-density parity-check (LDPC) codes, the sum-product algorithm with an appropriate schedule
becomes the standard iterative decoding algorithm that is normally used with such codes.
Later authors (e.g., [1, 20]) have shown that the sum-product algorithm is equivalent to Pearl's
"belief propagation" algorithm for statistical inference on Bayesian networks; the Baum-Welch
or "forward-backward" algorithm for inference with hidden Markov models; and the Kalman
smoother for linear Gaussian state-space models.
However, it is important to note that the min-sum algorithm is a two-way "backward-forward"
algorithm. The VA obtains the same result with a "forward-only" algorithm by storing a path
history with each survivor. Of course, "forward-only" is a key simplification, particularly for
real-time communications; the min-sum algorithm would never have been adopted in practice
as widely as the VA has been.¹
10 Conclusion
The Viterbi algorithm has been tremendously important in communications. For moderately
complex (not capacity-approaching) codes, it has proved to yield the best tradeoff between
performance and complexity both on power-limited channels, such as space channels, and on
bandwidth-limited channels, such as voiceband telephone lines. In practice, in these regimes
it has clearly outstripped its earlier rivals, such as sequential decoding and algebraic decoding.
(However, it seems likely that it will be superseded in many of its principal communications
applications by capacity-approaching codes with iterative decoding.)
Moreover, the VA has become a general-purpose algorithm for decoding hidden Markov models
in a huge variety of applications, from speech recognition to computational biology.
Andy Viterbi clearly did not envision the full import of the VA when he first introduced it.
However, he and his colleagues at Linkabit and Qualcomm were largely responsible for making
it practical, and for driving its widespread adoption in communications. The history might have
been otherwise, but it wasn't. In actual fact, no one deserves more credit for this tremendously
important invention than its actual inventor.
¹ Interestingly, Ungerboeck discovered both the sum-product and the min-sum algorithms for equalization
applications in his thesis [28]; however, he missed the forward-only version.
Acknowledgments
I am very grateful for comments on drafts of this paper by Keith Chugg, Dan Costello, Bob
Gallager, Jim Massey, Jim Omura, Sergio Verdu and Andy Viterbi.
References
[1] S. M. Aji and R. J. McEliece, "The generalized distributive law," IEEE Trans. Inform. Theory, vol.
46, pp. 325-343, Mar. 2000.
[2] L. R. Bahl, J. Cocke, F. Jelinek and J. Raviv, "Optimal decoding of linear codes for minimizing
symbol error rate," IEEE Trans. Inform. Theory, vol. IT-20, pp. 284-287, Mar. 1974.
[3] A. R. Cohen, J. A. Heller and A. J. Viterbi, "A new coding technique for asynchronous multiple
access communication," IEEE Trans. Commun. Tech., vol. COM-19, pp. 849-855, Oct. 1971.
[4] O. M. Collins, "The subtleties and intricacies of building a constraint length 15 convolutional
decoder," IEEE Trans. Commun., vol. 40, pp. 1810-1819, Dec. 1992.
[5] D. J. Costello, Jr., J. Hagenauer, H. Imai and S. B. Wicker, "Applications of error-control coding,"
IEEE Trans. Inform. Theory, vol. 44, pp. 2531-2560, Oct. 1998.
[6] G. D. Forney, Jr., "Review of random tree codes," Appendix A, Final Report, Contract NAS2-3637,
NASA CR73176, NASA Ames Res. Ctr., Moffett Field, CA, Dec. 1967.
[7] G. D. Forney, Jr., "Coding and its application in space communications," IEEE Spectrum, vol. 7,
pp. 47-58, 1970.
[8] G. D. Forney, Jr., "Maximum-likelihood sequence estimation of digital sequences in the presence of
intersymbol interference," IEEE Trans. Inform. Theory, vol. IT-18, pp. 363-378, May 1972.
[9] G. D. Forney, Jr., "The Viterbi algorithm," Proc. IEEE, vol. 61, pp. 268-278, March 1973.
[10] G. D. Forney, Jr., "Convolutional codes II. Maximum-likelihood decoding," Inform. and Control,
vol. 25, pp. 222-266, 1974.
[11] G. D. Forney, Jr., L. Brown, M. V. Eyuboglu, and J. L. Moran III, "The V.34 high-speed modem
standard," IEEE Commun. Mag., vol. 34, no. 12, pp. 28-33, Dec. 1996.
[12] R. G. Gallager, Low-Density Parity-Check Codes. Cambridge, MA: MIT Press, 1963.
[13] J. A. Heller, "Short constraint length convolutional codes," Jet Prop. Lab., Space Prog. Summary
37-54, vol. III, pp. 171-177, 1968.
[14] J. A. Heller, "Improved performance of short constraint length convolutional codes," Jet Prop. Lab.,
Space Prog. Summary 37-56, vol. III, pp. 83-84, 1969.
[15] J. A. Heller and I. M. Jacobs, "Viterbi decoding for satellite and space communication," IEEE Trans.
Commun. Tech., vol. COM-19, pp. 835-848, Oct. 1971.
[16] J. Henderson, S. Salzberg and K. H. Fasman, "Finding genes in DNA with a hidden Markov model,"
J. Comput. Biol., vol. 4, pp. 127-141, 1997.
[17] K. A. S. Immink, P. H. Siegel and J. K. Wolf, "Codes for digital recorders," IEEE Trans. Inform.
Theory, vol. 44, pp. 2260-2299, Oct. 1998.
[18] H. Kobayashi, "Application of probabilistic decoding to digital magnetic recording systems," IBM
J. Res. Dev., vol. 15, pp. 64-74, Jan. 1971.
[19] H. Kobayashi, "Correlative level coding and maximum likelihood decoding," IEEE Trans. Inform.
Theory, vol. IT-17, pp. 586-594, Sept. 1971.
[20] F. R. Kschischang, B. J. Frey and H.-A. Loeliger, "Factor graphs and the sum-product algorithm,"
IEEE Trans. Inform. Theory, vol. 47, pp. 498-519, Feb. 2001.
[21] J. L. Massey, Threshold Decoding. Cambridge, MA: MIT Press, 1963.
[22] D. Morton, "Andrew Viterbi, electrical engineer: An oral history," IEEE History Center, Rutgers
U., New Brunswick, NJ, Oct. 1999.
[23] J. K. Omura, "On the Viterbi decoding algorithm," IEEE Trans. Inform. Theory, vol. IT-15, pp.
177-179, 1969.
[24] J. K. Omura, "Optimal receiver design for convolutional codes and channels with memory via control
theoretical concepts," Info. Sci., vol. 3, pp. 243-266, July 1971.
[25] R. Padovani, "Ten years of progress in CDMA," Viterbi Conference, Univ. So. Calif., Los Angeles,
Mar. 2005.
[26] L. R. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition,"
Proc. IEEE, vol. 77, pp. 257-286, Feb. 1989.
[27] R. M. Tanner, "A recursive approach to low complexity codes," IEEE Trans. Inform. Theory, vol.
IT-27, pp. 533-547, Sept. 1981.
[28] G. Ungerboeck, "Nonlinear equalization of binary signals in Gaussian noise," IEEE Trans. Commun.
Tech., vol. COM-19, pp. 1128-1137, Dec. 1971.
[29] G. Ungerboeck, "Channel coding with multilevel/phase signals," IEEE Trans. Inform. Theory, vol.
IT-28, pp. 55-67, Jan. 1982.
[30] A. J. Viterbi, "Error bounds for convolutional codes and an asymptotically optimum decoding
algorithm," IEEE Trans. Inform. Theory, vol. IT-13, pp. 260-269, April 1967.
[31] A. J. Viterbi, "Review of Statistical Theory of Signal Detection (2nd ed.), by Carl W. Helstrom,"
IEEE Trans. Inform. Theory, vol. IT-16, p. 653, Sept. 1970.
[32] A. J. Viterbi, "Convolutional codes and their performance in communication systems," IEEE Trans.
Commun. Tech., vol. COM-19, pp. 751-772, Oct. 1971.
[33] A. J. Viterbi, "From proof to product," 1990 IEEE Communication Theory Workshop, Ojai, CA,
April 1990.
[34] N. Wiberg, "Codes and decoding on general graphs," Ph.D. dissertation, Linköping U., Linköping,
Sweden, 1996.
[35] N. Wiberg, H.-A. Loeliger and R. Kotter, "Codes and iterative decoding on general graphs," Eur.
Trans. Telecomm., vol. 6, pp. 513-525, Sept./Oct. 1995.
[36] J. M. Wozencraft and B. Reiffen, Sequential Decoding. Cambridge, MA: MIT Press, 1961.
[37] H. Yudkin, "Channel state testing in information decoding," Sc.D. dissertation, Dept. Elec. Engg.,
MIT, Cambridge, MA, 1964.
Jim Omura, "Reflections on Andrew Viterbi and his Algorithm," presented at the Viterbi
Conference: Advancing Technology through Communications Sciences, March 8-9, 2005, at
the University of Southern California.
[The copy of this paper was added to the Viterbi Family Archives by Michael Hooks,
Viterbi Family Archivist, and was not part of the original donation by Dr. Andrew
Viterbi.]
Reflections on Andrew Viterbi
and
his Algorithm
Jim Omura
March 8, 2005
My Lucky Timing
• Finished my Ph.D. work at Stanford in 1966 and was working
at the Stanford Research Institute.
• First met Andrew Viterbi at JPL Conference in the Fall of
1966 where he presented,
A. J. Viterbi, "Error bounds for convolutional
codes and an asymptotically optimum
decoding algorithm," IEEE Trans. Inform.
Theory, vol. IT-13, pp. 260-269, Apr. 1967.
~ Random coding bound showing the potential power of convolutional
codes
~ The Algorithm was mainly used to prove this performance bound
Stanford Course on Coding
(1967)
• J. M. Wozencraft and I. M. Jacobs,
Principles of Communication Engineering,
New York, Wiley, 1965
• Started to think about maximum likelihood
decoding of convolutional codes, an
unsolved problem (according to W&J)
• Returned to my Ph.D. thesis work at Stanford
during 1963-1966
Noiseless Feedback Channels
[Block diagram: Transmitter → Channel (noise N_k) → Receiver, with the received values R_k returned to the transmitter over a noiseless feedback link]
Shannon's result that the channel capacity of a
memoryless noisy channel is not increased by noiseless
feedback is rather surprising.
Noiseless Feedback Channels
[Block diagram repeated: transmitter output C_k, channel noise N_k, received value R_k fed back noiselessly]
A. J. Viterbi, "The effects of sequential decision feedback
on communications over the Gaussian channel," Inform.
and Control, vol. 8, no. 1, pp. 80-92, Feb. 1965. (Post-decision
feedback)
Ph.D. Thesis (1966)
[Block diagram: Transmitter → Channel (noise N_k) → Receiver with noiseless feedback; transmitted signal Z_k = G(S_k), transmitter state update S_{k+1} = F(S_k, R_k), k = 1, 2, 3, ..., N]
J. K. Omura, "Optimum linear transmission of analog
data for channels with feedback," IEEE Trans. Inform.
Theory, vol. IT-14, pp. 38-43, Jan. 1968.
Dynamic Programming Relation to
Viterbi Decoding (1968)
[Block diagram: Transmitter → Channel (noise N_k) → Receiver, with noiseless feedback; transmitted signal X_k]
Convolutional Encoder
• Binary input
• Finite memory
• Finite state dynamical system
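The last bullet is the key observation: because the encoder is a finite-state dynamical system, maximum-likelihood decoding becomes a dynamic program, a shortest-path search through the state trellis. A minimal sketch of my own (not from the talk), using a standard textbook rate-1/2, four-state convolutional code with octal generators (7, 5):

```python
# Viterbi decoding as dynamic programming on a convolutional encoder's
# state diagram. Rate-1/2, 4-state code with octal generators (7, 5) --
# a standard textbook example, not a code from the talk.

def encode_step(state, bit):
    """One step of the finite-state encoder: (next_state, two output bits)."""
    s1, s2 = state                                # two memory bits
    out = (bit ^ s1 ^ s2, bit ^ s2)               # generators 111 and 101
    return (bit, s1), out

def viterbi_decode(received):
    """ML decoding under the Hamming metric: shortest path through the trellis."""
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    metric = {s: (0 if s == (0, 0) else float("inf")) for s in states}
    paths = {s: [] for s in states}
    for r in received:                            # r = pair of received bits
        new_metric, new_paths = {}, {}
        for s in states:
            for bit in (0, 1):
                ns, out = encode_step(s, bit)
                m = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                if ns not in new_metric or m < new_metric[ns]:   # keep survivor
                    new_metric[ns], new_paths[ns] = m, paths[s] + [bit]
        metric, paths = new_metric, new_paths
    return paths[min(states, key=lambda s: metric[s])]

msg = [1, 0, 1, 1, 0, 0]                          # last two bits flush the encoder
state, tx = (0, 0), []
for b in msg:
    state, out = encode_step(state, b)
    tx.append(out)
tx[2] = (tx[2][0] ^ 1, tx[2][1])                  # one channel bit error
print(viterbi_decode(tx))                         # -> [1, 0, 1, 1, 0, 0]
```

Since the free distance of this code is 5, a single channel bit error is corrected and the decoded sequence matches the message.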
I found the optimum decoding
algorithm for convolutional
codes! Not!!
A. J. Viterbi, "Error bounds for
convolutional codes and an
asymptotically optimum
decoding algorithm," IEEE
Trans. Inform. Theory, vol. IT-13,
pp. 260-269, Apr. 1967.
I was the first to prove the optimality of
Viterbi's algorithm! Not!!
G. D. Forney, "Final report on a
coding system design for advanced
solar missions," Codex Corp.,
Watertown, Mass., Contract NAS2-3637,
Dec. 1967, Appendix A.
Small World: Dave Forney and I were students
in an Information Theory course at MIT in
1962 taught by Irwin Jacobs.
Dave Forney's Trellis
[Figure: trellis diagram with states and 0/1 branch labels]
I helped name Viterbi's algorithm!
• A. J. Viterbi, "Error bounds for convolutional
codes and an asymptotically optimum
decoding algorithm," IEEE Trans. Inform.
Theory, vol. IT-13, pp. 260-269, Apr. 1967.
• G. D. Forney, "Final report on a coding system
design for advanced solar missions," Codex
Corp., Watertown, Mass., Contract NAS2-3637,
Dec. 1967, Appendix A.
• J. K. Omura, "On the Viterbi decoding
algorithm," IEEE Trans. Inform. Theory
(Correspondence), vol. IT-15, pp. 177-179,
Jan. 1969.
Real Payoff: Andy offered me a
faculty position at UCLA
• Andy became my mentor for my academic
career
- Shared all of his lecture material
- Advised me on teaching and research
- Discussed many research problems
• We worked together on two projects that
were the highlights of my academic
career.
Extended Channel Coding Type
Bound to Source Coding
• J. K. Omura, "A coding theorem for
discrete-time sources," IEEE Trans.
Inform. Theory, vol. IT-19, pp. 490-498, Jul
1973.
d_N(C) < D + d_0 e^(-N E(R,D)),
where E(R, D) > 0 for R > R(D)
Extended Source Coding Bound to
Convolutional Codes
• J. K. Omura, "A coding theorem for
discrete-time sources," IEEE Trans.
Inform. Theory, vol. IT-19, pp. 490-498, Jul
1973.
• A. J. Viterbi and J. K. Omura, "Trellis
encoding of memoryless discrete-time
sources with a fidelity criterion," IEEE
Trans. Inform. Theory, vol. IT-20, pp. 325-332, May 1974.
These papers showed source coding bounds that parallel the
bounds for block and convolutional codes for error correction.
What an Offer!!
• Andy had completed 3/4 of his book on
Principles of Digital Communication and
Coding
• He offered me a co-authorship if I would
help him complete the book by adding
material on source coding (Shannon's
Rate Distortion Theory)
Looking back today, co-authoring this book with Andy
was the high point of my 15-year academic career.
Implementation of the
Viterbi Algorithm
Implementation of the Viterbi Algorithm
provided the yardstick for how rapidly
concepts from Information Theory were
being applied to create the Information
Age we know today
Transition to the Information Age
[Chart: transistor counts of Intel microprocessors, illustrating the transition from the analog world to the digital world around 1980]
Information Theory → Information Age
Transitions to Applications of
Information Theory and Coding
• Andy left UCLA to spend more time at
Linkabit and joined the UC San Diego faculty
• Jim Massey joined UCLA during 1978-1980
- Jim left for ETH in Switzerland in 1981
- Short courses: Spread Spectrum Systems
and Cryptography
- Two joint patents on implementations and
applications of public key cryptography
Possible to design a 1024-bit public key cryptographic chip.
Transitions to Applications of
Information Theory and Coding
• I left UCLA to start Cylink in 1984
- Commercial cryptographic systems
- Commercial spread spectrum radios
- Co-founded with Lew Morris
- Jim Massey was a non-active founder
- Elwyn Berlekamp was the first Board Member
• Andy co-founded Qualcomm in 1985
Reflections on Andrew Viterbi and
his Algorithm
• The Viterbi Algorithm
- Invented as a proof technique (Viterbi, 1967)
- It was optimum (Forney, 1967)
- It was practical (Heller, 1969)
- My yardstick for Moore's Law (1970-1985)
• Andrew Viterbi
- Mentor, colleague, and co-author (UCLA)
- Role model for transitioning from academia to
becoming an entrepreneur (Cylink)
Paul H. Siegel, "Applications of the Viterbi Algorithm in Data Storage Technology,"
presented at the Viterbi Conference: Advancing Theory through Communications
Sciences, March 8-9, 2005, at the University of Southern California.
[The copy of this paper was added to the Viterbi Family Archives by Michael Hooks,
Viterbi Family Archivist, and was not part of the original donation by Dr. Andrew
Viterbi.]
2005 Viterbi Conference
Applications of the Viterbi Algorithm in
Data Storage Technology
Paul H. Siegel
Director, CMRR (Center for Magnetic Recording Research)
Electrical and Computer Engineering
University of California, San Diego
3/8/05
Outline
• Data storage trends
• Recording channel technology
- PRML
- Coded PRML
- Turbo equalization
• Channel capacity
• Concluding remarks
Digital Recording Channel
[Block diagram: Error Correction Encoder → Modulation Encoder → Precoder → Write Equalization → Head + Medium → Read Equalization → Detector → Modulation Decoder → Error Correction Decoder, with Timing Recovery at the detector. Example sequences: user bits 1110110, modulation-encoded 11101101, written 11110001001, detected 10010001001]
Magnetic Recording Process
[Figure: the input signal magnetizes the medium; a readback signal is recovered from the magnetized medium]
Areal Density Progress
[Chart (Hitachi, courtesy Ed Grochowski): areal density vs. production year, 1960-2005; first MR head, 60% CGR, a 35-million-fold increase; superparamagnetic effect; HGST disk drive products, industry lab demos, and drives with AFC]
Average Price of Storage
[Chart (Hitachi, courtesy Ed Grochowski): average price of storage vs. year, 1990-2010, for HDD, DRAM, and Flash, against the range of paper/film, with projections]
A Disk Drive (and VA) in Every Pocket
[Image: Apple iPod: 10,000 songs with album covers; Toshiba 1.8" drive, 40.0 Gigabytes (80 GB on the way!)]
Signal Processing and Coding Innovation
[Chart: areal density vs. production year, 1960-2000, marking the transition from analog to digital detection: FM, MFM, Peak Detection, then PRML, EPRML, E2PRML, NPML, and parity + post-processing, with codes (1,7), (0,G/I), MSN, and TMTR]
Key References and Their Impact...
[1] A.J. Viterbi, "Error Bounds for Convolutional Codes and an
Asymptotically Optimum Decoding Algorithm," IEEE Transactions on
Information Theory, vol. IT-13, no. 2, pp. 260-269, April 1967.
[2] A.J. Viterbi, "Convolutional Codes and Their Performance in
Communication Systems," IEEE Transactions on Communications
Technology, vol. COM-19, no. 5, pp. 751-772, October 1971.
[3] A.J. Viterbi and J. K. Omura, Principles of Digital Communication and
Coding. New York, NY: McGraw-Hill, Inc., 1979, Ch. 4.9, pp. 272-284.
[4] A.J. Viterbi, "An Intuitive Justification and a Simplified Implementation of
the MAP Decoder for Convolutional Codes," IEEE Journal on Selected
Areas in Communications, vol. 16, no. 2, pp. 260-264, February 1998.
PRML ...
[1] "Error Bounds for Convolutional Codes and an
Asymptotically Optimum Decoding Algorithm"
~ Since the introduction of PRML technology
in 1990, the VA has been the standard
detection method in disk drives.
Coded PRML ...
[2] "Convolutional Codes and Their Performance in
Communication Systems"
[3] Principles of Digital Communication and Coding
~ Since the mid-1990's, error event characterization
of partial-response channels has been used to
bound performance and to design constrained
modulation codes that detect and/or forbid
dominant error events.
Turbo Equalization and Channel Capacity
[4] "An Intuitive Justification and a Simplified
Implementation of the MAP Decoder for
Convolutional Codes"
> "Turbo-equalized" recording channels (proposed)
use a modified "dual-max" algorithm for
detection and a difference-metric LDPC decoder.
> Sharp estimates of the recording channel capacity
are calculated using a "generalized VA."
What is PRML?
• "PR" = Partial Response [Class-4] Equalization: h(D) = (1 - D^2) = (1 - D)(1 + D)
• "ML" = Maximum Likelihood Sequence Detection (VA)
[Figure: "dicode" trellis for the even/odd interleaves, h(D) = 1 - D; two states with branch labels (input/output) -1/0, 1/2, -1/-2, 1/0]
*The acronym "PRML" was coined by Andre Milewski, of IBM La Gaude.
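The two ingredients can be combined in a few lines: the two-state trellis for h(D) = 1 - D and a Viterbi search under the squared-Euclidean metric. A sketch of my own (the data and noise values are illustrative, not from the slides):

```python
# ML sequence detection ("PRML") for the dicode channel h(D) = 1 - D:
# inputs x_k in {+1, -1}, noiseless outputs x_k - x_{k-1} in {0, +2, -2},
# and two trellis states (the previous input).

def dicode_viterbi(samples, initial=-1):
    """Return the +/-1 input sequence minimizing squared Euclidean distance."""
    states = (+1, -1)
    metric = {s: (0.0 if s == initial else float("inf")) for s in states}
    paths = {s: [] for s in states}
    for r in samples:
        new_metric, new_paths = {}, {}
        for prev in states:
            for x in states:                      # candidate current input
                m = metric[prev] + (r - (x - prev)) ** 2
                if x not in new_metric or m < new_metric[x]:     # survivor
                    new_metric[x], new_paths[x] = m, paths[prev] + [x]
        metric, paths = new_metric, new_paths
    return paths[min(states, key=lambda s: metric[s])]

x = [1, 1, -1, 1, -1, -1]                          # written data
noiseless = [a - b for a, b in zip(x, [-1] + x[:-1])]
noisy = [v + e for v, e in zip(noiseless, [0.3, -0.4, 0.2, 0.5, -0.3, 0.1])]
print(dicode_viterbi(noisy))                       # -> [1, 1, -1, 1, -1, -1]
```

With noise small relative to the output level spacing of 2, the detector recovers the written sequence exactly.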
Difference Metric VA for Dicode
[Worked example: received samples r_n, difference-metric updates DM, and decoded outputs ŷ on the two-state dicode trellis]
• Used in first commercial disk drive with PRML:
IBM 681 (1990)
Beyond PRML
• Extended PRML ("E^N PRML"): h(D) = (1 - D)(1 + D)^(N+1), N ≥ 1
- Viterbi detector has 2^(N+2) states.
- EPR4 and E2PR4 have been widely used in commercial drives.
• Noise-predictive PRML (a.k.a. Generalized PRML): h(D) = (1 - D^2)(1 + p1 D + p2 D^2)
- PR4 target followed by a noise-whitening filter
Post-Processor EPRML Detector
• "Turbo-PRML" (1993)
[Block diagram: equalized PR4 signal → PR4 Viterbi detector → (1+D) → post-processor, which combines the PRML estimate and alternate paths into an enhanced estimate]
Trellis-coded PRML
• Convolutional code with channel precoder
• Combined convolutional code and channel trellis detector
[Block diagram: Conv. Encoder → Precoder 1/(1+D) → Dicode Channel, producing a coset sequence]
d²_free ≥ d^H_free + 1 if d^H_free is odd; d²_free ≥ d^H_free if d^H_free is even
Distance-Enhancing Constrained Codes
• Characterize PR channel error-events using error-state
diagram analysis. (See [2], [3].)
• Determine modulation constraints that reduce and/or
forbid dominant error events, and design code.
• Incorporate channel and code constraints into detector
trellis, or use reduced-state trellis and a post-processor.
Error Event Analysis - E2PR4
• E2PR4: h(D) = (1 - D)(1 + D)^3
• Input "error" events: e(D) = x1(D) - x2(D)

d²(e) = 6:  + - +
d²(e) = 8:  + - + 0 0 + - + ;  + - + - (+ -) ;  + - + - (+ -) +
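The d²(e) values for E2PR4 can be checked mechanically: convolve the input error event with h(D) = 1 + 2D - 2D^3 - D^4 and sum the squares of the output. A small verification sketch of my own (assuming the ±1 normalization of e):

```python
# Squared Euclidean distance d^2(e) of an input error event e(D) on the
# E2PR4 channel h(D) = (1 - D)(1 + D)^3 = 1 + 2D - 2D^3 - D^4:
# convolve e with h and sum the squares of the result.

def convolve(e, h):
    out = [0] * (len(e) + len(h) - 1)
    for i, ei in enumerate(e):
        for j, hj in enumerate(h):
            out[i + j] += ei * hj
    return out

def d2(e, h=(1, 2, 0, -2, -1)):
    """d^2(e) = ||e * h||^2 for an error event e with entries in {+1, 0, -1}."""
    return sum(v * v for v in convolve(e, h))

print(d2([1, -1, 1]))                    # + - +              -> 6
print(d2([1, -1, 1, -1]))                # + - + -            -> 8
print(d2([1, -1, 1, -1, 1]))             # + - + - +          -> 8
print(d2([1, -1, 1, 0, 0, 1, -1, 1]))    # + - + 0 0 + - +    -> 8
```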
Distance-Enhancing Codes
• Matched-Spectral-Null (MSN) codes
- DC-null and order-K Nyquist null on E2PR4: d²_free ≥ 2(K + 3)
• Maximum-Transition-Run MTR(j,k) codes
- Limit number of consecutive 1's to j (k) on even (odd) phase
- For E2PR4, the MTR(2,3) constraint yields: d²_free = 10
• Parity-check codes
- Detect variety of error events
Combined Code-Channel Trellis
[Figures: MTR(2,3) constraint graph (NRZI format) and the combined MTR(2,3) and E2PR4 trellis (NRZ format)]
State-of-the-Art Channel
• Rate-96/104 dual-parity code with MTR(3,3) constraints
- Eliminates all error events of type: + - + - , + - + - + , + - + -
- Eliminates half of events of type: + - +
- Detects error events of type: +, + -, + - +, and +00+
• 16-state NPML detector with dual-parity post-processing
- Gain of 0.75 dB over rate-48/49, no parity, at P_e(sector) = 10^-6
Turbo Equalization
[Block diagram: LDPC Encoder → GPR Channel → BCJR-APP Detector ⇄ LDPC Decoder, exchanging extrinsic information]
• Length-4376 LDPC code
• Gain ~4 dB over uncoded NPML at P_e(symbol) = 10^-5
• Gap to capacity ~1.5 dB
Simplified BCJR: Dual-Max Detector
[Figure (from [4]): generalized VA forward recursion from the initial node and generalized VA backward recursion from the final node over the trellis nodes, combined in a generalized dual-maxima computation at the kth branch received symbols y_k (BCJR)]
L_n = max_{s',s: x=+1} {A_{n-1}(s') + B_n(s) + c_n(s',s)} - max_{s',s: x=-1} {A_{n-1}(s') + B_n(s) + c_n(s',s)}
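The dual-max rule can be sketched directly from the formula: forward metrics A and backward metrics B are computed with Viterbi-style max recursions, and L_n is the difference of two maxima, one over branches with x = +1 and one over branches with x = -1. An illustrative implementation of my own for the two-state dicode trellis (the noise variance and sample values are assumptions, not from the slides):

```python
# Dual-max (max-log-MAP) detection on the two-state dicode trellis,
# h(D) = 1 - D. Forward metrics A and backward metrics B replace the
# BCJR's log-sums with maxima; the soft output L_n is a difference of
# two maxima, taken over branches carrying x = +1 and x = -1.

STATES = (+1, -1)                        # state = previous input symbol

def branch_metric(r, prev, x, noise_var=0.5):
    """Log-likelihood (up to a constant) of sample r on the branch prev -> x."""
    return -((r - (x - prev)) ** 2) / (2 * noise_var)

def dual_max_llrs(samples):
    A = [{s: (0.0 if s == -1 else -1e9) for s in STATES}]    # known start state
    for r in samples:                                        # forward recursion
        A.append({s: max(A[-1][p] + branch_metric(r, p, s) for p in STATES)
                  for s in STATES})
    B = [{s: 0.0 for s in STATES}]                           # backward recursion
    for r in reversed(samples):
        B.insert(0, {p: max(B[0][s] + branch_metric(r, p, s) for s in STATES)
                     for p in STATES})
    llrs = []
    for k, r in enumerate(samples):                          # dual-maxima step
        best = {+1: -1e9, -1: -1e9}
        for p in STATES:
            for x in STATES:
                best[x] = max(best[x],
                              A[k][p] + branch_metric(r, p, x) + B[k + 1][x])
        llrs.append(best[+1] - best[-1])
    return llrs

samples = [2.1, -0.2, -1.7, 1.9]         # noisy dicode outputs, start state -1
print([+1 if l > 0 else -1 for l in dual_max_llrs(samples)])   # -> [1, 1, -1, 1]
```

The sign of each L_n is the hard decision; the magnitude is the reliability passed to an outer decoder in a turbo-equalized system.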
Capacity of Magnetic Recording Channels
• Binary input, linear ISI, additive i.i.d. Gaussian noise:
y[i] = Σ_{k=0}^{n-1} h[k] x[i-k] + n[i]
• Capacity C:
C = max_{p(x)} I(X;Y) = max_{p(x)} H(Y) - H(Y | X) = max_{p(x)} H(Y) - (1/2) log(πe N_0)
For a given p(x), we want to compute H(Y).
Computing Entropy Rates
• The Shannon-McMillan-Breiman theorem implies -(1/n) log p(y_1^n) → H(Y)
as n → ∞, where y_1^n is a single long sample realization of
the channel output process.
• The probability p(y_1^n) can be computed using the forward
recursion of the BCJR-APP algorithm.
• In the log domain, this forward recursion can be interpreted as
a "generalized Viterbi algorithm." (See [4].)
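The recipe above can be sketched numerically. The following is my own illustration, assuming the dicode channel y_k = x_k - x_{k-1} + n_k with i.i.d. equiprobable ±1 inputs and Gaussian noise: the forward recursion runs on one long simulated realization, and -(1/n) log p(y_1^n) estimates H(Y).

```python
import math, random

# Monte Carlo estimate of the output entropy rate H(Y), in nats, for the
# dicode channel y_k = x_k - x_{k-1} + n_k with i.i.d. equiprobable +/-1
# inputs: run the normalized BCJR forward recursion on one long simulated
# realization; log p(y_1^n) is the accumulated sum of log normalizers.

def gauss(y, mean, var):
    return math.exp(-(y - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def entropy_rate_estimate(n=100_000, var=1.0, seed=1):
    rng = random.Random(seed)
    alpha = {+1: 0.0, -1: 1.0}           # state = previous input; start in -1
    prev, log_p = -1, 0.0
    for _ in range(n):
        x = rng.choice((+1, -1))                        # simulate the channel
        y = x - prev + rng.gauss(0.0, math.sqrt(var))
        prev = x
        # forward step: the new state is the current input, drawn w.p. 1/2
        new = {s: sum(alpha[p] * 0.5 * gauss(y, s - p, var) for p in (+1, -1))
               for s in (+1, -1)}
        z = new[+1] + new[-1]            # normalizer = p(y_k | y_1, ..., y_{k-1})
        log_p += math.log(z)
        alpha = {s: new[s] / z for s in (+1, -1)}
    return -log_p / n                    # approaches H(Y) as n grows

print(round(entropy_rate_estimate(), 3))
```

Subtracting (1/2) log(2πe·var) from this estimate gives the information rate I(X;Y) for i.i.d. inputs, which is the quantity bounded on the next slide.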
Capacity Bounds for Dicode h(D) = 1 - D
[Plot: information rate vs. SNR (dB); curves labeled "No ISI," "Dicode SIR," and "Dicode"]
Concluding Remarks
• The Viterbi Algorithm and related ML performance evaluation
techniques have been vital to the advancement of data storage
technology (magnetic and optical) since 1990.
• The "Viterbi architecture" for APP computation has influenced
the development and evaluation of capacity-approaching
coding schemes for digital recording applications.
• Future storage technologies offer interesting challenges in
detection and decoding...
Holographic Recording
[Figure: 2-D intersymbol interference; impulse response matrix
h = 0.05 0.21 0.05
    0.20 0.91 0.19
    0.01 0.10 0.02 ]
Two-Dimensional Optical Storage (TwoDOS)
[Figure: broad spiral containing 11 bit-rows; 2-D impulse response]
Courtesy of Wim Coene, Philips Research
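The 3x3 impulse response makes each readback sample a 2-D convolution of neighboring bits with h, which is what a 2-D detector must untangle. A toy sketch of my own (the bit pattern is illustrative):

```python
# 2-D intersymbol interference: each readback sample is the 2-D convolution
# of the stored bit array with the 3x3 impulse response from the slide.

H = [[0.05, 0.21, 0.05],
     [0.20, 0.91, 0.19],
     [0.01, 0.10, 0.02]]

def readback(bits):
    """'Full' 2-D convolution of a 0/1 bit array with H (noise-free)."""
    rows, cols = len(bits) + 2, len(bits[0]) + 2
    out = [[0.0] * cols for _ in range(rows)]
    for i, row in enumerate(bits):
        for j, b in enumerate(row):
            if b:                        # each written 1 smears H over neighbors
                for di in range(3):
                    for dj in range(3):
                        out[i + di][j + dj] += H[di][dj]
    return out

page = [[1, 0, 1],
        [0, 1, 0],
        [1, 0, 0]]
r = readback(page)
print(round(r[2][2], 4))                 # center sample, blurred by ISI -> 0.99
```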
And, finally ...
• Congratulations - and many thanks - Andy!!
on the occasion of your milestone birthday, and for
your many landmark contributions to science,
technology, and engineering education.
PRML References
• H. Kobayashi and D.T. Tang, "Application of partial-response
channel coding to magnetic recording systems," IBM J. Res.
Develop., vol. 14, pp. 368-375, July 1970.
• H. Kobayashi, "Application of probabilistic decoding to digital
magnetic recording systems," IBM J. Res. Develop., vol. 15, pp. 65-74, Jan. 1971.
• H. Kobayashi, "Correlative level coding and maximum-likelihood
decoding," IEEE Trans. Inform. Theory, vol. IT-17, pp. 586-594,
Sept. 1971.
• G.D. Forney, Jr., "Maximum likelihood sequence detection in the
presence of intersymbol interference," IEEE Trans. Inform. Theory,
vol. IT-18, pp. 363-378, May 1972.
• R.D. Cideciyan, et al., "A PRML system for digital magnetic
recording," IEEE J. Select. Areas Commun., vol. 10, no. 1, pp. 38-56, Jan. 1992.
EPRML References
• H.K. Thapar and A.M. Patel, "A class of partial response systems for
increasing storage density in magnetic recording," IEEE Trans. Magn.,
pp. 3666-3678, Sept. 1987.
• G. Fettweis, R. Karabed, P. H. Siegel, and H. K. Thapar, "Reduced-complexity
Viterbi detector architectures for partial response signaling," in
Proc. 1995 Global Telecommun. Conf. (Globecom '95), Singapore,
pp. 559-563.
• R. Wood, "Turbo-PRML: A compromise EPRML detector," IEEE Trans.
Magn., vol. 29, pp. 4018-4020, Nov. 1993.
• K. K. Fitzpatrick, "A reduced complexity EPR4 post-processor," IEEE Trans.
Magn., vol. 34, pp. 135-140, Jan. 1998.
• J. D. Coker, E. Eleftheriou, R. L. Galbraith, and W. Hirt, "Noise-predictive
maximum likelihood (NPML) detection," IEEE Trans. Magn., pt. 1, vol. 34,
pp. 110-117, Jan. 1998.
Coded PRML References
• J. K. Wolf and G. Ungerboeck, "Trellis coding for partial-response channels,"
IEEE Trans. Commun., vol. COM-34, no. 8, pp. 765-773, Aug. 1986.
• R. Karabed and P. Siegel, "Matched spectral-null codes for partial response
channels," IEEE Trans. Inform. Theory, vol. 37, no. 3, pp. 818-855, May
1991.
• J. Moon and B. Brickner, "Maximum transition run codes for data storage
systems," IEEE Trans. Magn., vol. 32, pp. 3992-3994, Sept. 1996.
• W. Bliss, "An 8/9 rate time-varying trellis code for high density magnetic
recording," IEEE Trans. Magn., vol. 33, pp. 2746-2748, Sept. 1997.
• S.A. Altekar, M. Berggren, B.E. Moision, P.H. Siegel, and J.K. Wolf, "Error
event characterization on partial-response channels," IEEE Trans. Inform.
Theory, vol. 45, no. 1, pp. 241-247, Jan. 1999.
Coded PRML References (cont.)
• R. Karabed, P.H. Siegel, and E. Soljanin, "Constrained coding for binary
channels with high intersymbol interference," IEEE Trans. Inform. Theory,
vol. 45, no. 5, pp. 1777-1797, Sept. 1999.
• T. Conway, "A new target response with parity coding for high density
magnetic recording channels," IEEE Trans. Magn., vol. 34, no. 4,
pp. 2382-2486, July 1998.
• R.D. Cideciyan, J.D. Coker, E. Eleftheriou, and R.L. Galbraith, "Noise-predictive
maximum likelihood detection combined with parity-based post-processing,"
IEEE Trans. Magn., vol. 37, no. 2, pp. 714-720, March 2001.
• R.D. Cideciyan, E. Eleftheriou, B.H. Marcus, and D. S. Modha, "Maximum
transition run codes for generalized partial response channels," IEEE J. Select.
Areas Commun., vol. 19, no. 4, pp. 619-634, April 2001.
• R.D. Cideciyan and E. Eleftheriou, "Codes satisfying maximum transition run
and parity-check constraints," Proc. IEEE Int. Conf. Commun., vol. 27, no. 1,
June 2004, pp. 635-639.
Turbo Equalization References
• L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv, "Optimal decoding of linear
codes for minimizing symbol error rate," IEEE Trans. Inform. Theory,
vol. IT-20, pp. 284-287, Sept. 1974.
• W. Ryan, "Performance of high rate turbo codes on a PR4-equalized
magnetic recording channel," Proc. 1998 Int. Conf. Commun., vol. 2,
June 1998, pp. 947-951.
• T. Souvignier, A. Friedmann, M. Oberg, P. H. Siegel, R. E. Swanson, and J.
K. Wolf, "Turbo decoding for PR4: parallel versus serial concatenation,"
Proc. IEEE ICC '99, Vancouver, Canada, June 1999, pp. 1638-1642.
• B. M. Kurkoski, P. H. Siegel, and J. K. Wolf, "Joint message-passing decoding
of LDPC codes and partial-response channels," IEEE Trans. Inform.
Theory, vol. 48, no. 6, pp. 1410-1422, June 2002.
Capacity Calculation References
• D. Arnold and H.-A. Loeliger, "On the information rate of binary-input
channels with memory," Proc. IEEE ICC 2001, (Helsinki, Finland), June
2001, pp. 2692-2695.
• H. D. Pfister, J. B. Soriaga, and P. H. Siegel, "On the achievable information
rates of finite state ISI channels," Proc. IEEE GLOBECOM 2001, (San
Antonio, Texas), Nov. 2001, pp. 2992-2996.
• A. Kavcic, "On the capacity of Markov sources over noisy channels," Proc.
IEEE GLOBECOM 2001, (San Antonio, Texas), Nov. 2001, pp. 2997-3001.
• P. Vontobel and D. M. Arnold, "An upper bound on the capacity of channels
with memory and constraint input," Proc. IEEE Inform. Theory Workshop,
(Cairns, Australia), Sept. 2001.
• S. Yang and A. Kavcic, "Capacity of partial response channels," Handbook
on Coding and Signal Processing for Recording Systems, CRC Press, 2004,
Ch. 13.
Robert J. McEliece, "Viterbi's Impact on the Exploration of the Solar System,"
presented at the Viterbi Conference: Advancing Theory through Communications
Sciences, March 8-9, 2005, at the University of Southern California.
[The copy of this paper was added to the Viterbi Family Archives by Michael Hooks,
Viterbi Family Archivist, and was not part of the original donation by Dr. Andrew
Viterbi.]
Error Bounds for Convolutional Codes
and an Asymptotically Optimum
Decoding Algorithm

IV. A PROBABILISTIC NONSEQUENTIAL DECODING ALGORITHM

We now describe a new probabilistic nonsequential
decoding algorithm which, as we shall show in the
next section, is asymptotically optimum for rates R >
R_0 = E_0(1). The algorithm decodes an L-branch tree
by performing L repetitions of one basic step. We adopt
the convention of denoting each branch of a given path
by its data symbol a_i, an element of GF(q). Also, although
GF(q) is isomorphic to the integers modulo q only when
q is a prime, for the sake of compact notation, we shall
use the integer r to denote the rth element of the field.

In Step 1 the decoder considers all q^K paths for the
first K branches (where K is the branch constraint length
of the code) and computes all q^K likelihood functions
∏_{i=1}^{K} P(y_i | a_i). The decoder then compares the likelihood
functions for the q paths:
(0, a_2, a_3, ..., a_K),
(1, a_2, a_3, ..., a_K),
...
(q - 1, a_2, a_3, ..., a_K)
for each of the q^{K-1} possible vectors (a_2, a_3, ..., a_K).
It thus performs q^{K-1} comparisons, each among q path
likelihood functions. Let the path corresponding to the
maximum likelihood function in each comparison be denoted
the survivor; the q^{K-1} survivors of as many comparisons
are preserved for further consideration; the
remaining paths are discarded. Among the q^{K-1} survivors
SUMMARY
[Chart: required Eb/N0 (dB) vs. coding scheme for deep-space missions, with the Shannon limit at -1.59 dB. Missions: Mariner 4, Voyager, Pathfinder, MER, MGS, Cassini. Codes: uncoded; (32,6) biorthogonal; (7, 1/2) convolutional; (7, 1/2) conv. + (255,223) RS; (15, 1/6) conv. + (255,223) RS; (8920, 1/6) turbo]