
Extreme Statistics in Nanoscale Memory Design

  • Book
  • © 2010

Overview

  • Includes a treatment of memory design from the perspective of statistical analysis
  • Covers relevant theoretical background from other fields: statistics, machine learning, optimization, reliability
  • Explains the problem of estimating statistics of memory performance variation
  • Shows solutions recently proposed in the Electronic Design Automation (EDA) community
  • Contains chapters contributed from both industry and academia
  • Includes supplementary material: sn.pub/extras

Part of the book series: Integrated Circuits and Systems (ICIR)

Table of contents (8 chapters)

About this book

"Knowledge exists: you only have to find it."

VLSI design has come to an important inflection point with the appearance of large manufacturing variations as semiconductor technology has moved to 45 nm feature sizes and below. If we ignore the random variations in the manufacturing process, simulation-based design essentially becomes useless, since its predictions will be far from the reality of manufactured ICs. On the other hand, using design margins based on some traditional notion of worst-case scenarios can force us to sacrifice too much in terms of power consumption or manufacturing cost, to the extent of making the design goals even infeasible. We absolutely need to explicitly account for the statistics of this random variability, to have design margins that are accurate, so that we can find the optimum balance between yield loss and design cost. This discontinuity in design processes has led many researchers to develop effective methods of statistical design, where the designer can simulate not just the behavior of the nominal design, but the expected statistics of the behavior in manufactured ICs. Memory circuits tend to be the hardest hit by the problem of these random variations because of their high replication count on any single chip, which demands a very high statistical quality from the product. Requirements of 5–6σ (0. …
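To put that statistical quality requirement in concrete terms, here is a minimal back-of-the-envelope sketch in Python (the cell count and chip-yield target are assumed, illustrative numbers, not figures from the book) of how replication turns a modest chip-level yield target into a per-cell failure probability near 10^-9, i.e. roughly the 5–6σ regime mentioned above, and why brute-force Monte Carlo is impractical at that level.

    # Illustrative sketch (assumed numbers): how memory replication drives
    # per-cell failure requirements into the 5-6 sigma range.
    from scipy.stats import norm

    n_cells = 10_000_000        # assumed: bitcells replicated on one chip
    chip_yield_target = 0.99    # assumed: fraction of chips with zero failing cells

    # If cell failures are independent, chip yield ~= (1 - p_fail)^n_cells,
    # so the allowed per-cell failure probability is:
    p_fail_max = 1.0 - chip_yield_target ** (1.0 / n_cells)

    # Express that probability as an equivalent one-sided sigma level.
    sigma_level = norm.isf(p_fail_max)

    print(f"allowed per-cell failure probability: {p_fail_max:.2e}")
    print(f"equivalent sigma level: {sigma_level:.2f}")

    # Plain Monte Carlo needs on the order of 1/p_fail_max circuit
    # simulations just to observe a single failing sample.
    print(f"Monte Carlo samples per observed failure: ~{1.0 / p_fail_max:.1e}")

With these assumed numbers the allowed per-cell failure probability comes out around 1e-9 (roughly 6σ), and a naive Monte Carlo run would need on the order of a billion circuit simulations to see even one failure; this is the motivation for the faster rare-event estimation methods surveyed in the book.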

Editors and Affiliations

  • T.J. Watson Research Center, IBM Corporation, Yorktown Heights, USA

    Amith Singhee

  • Department of Computer Science, University of Illinois at Urbana-Champaign, Urbana, USA

    Rob A. Rutenbar

Bibliographic Information

  • Book Title: Extreme Statistics in Nanoscale Memory Design

  • Editors: Amith Singhee, Rob A. Rutenbar

  • Series Title: Integrated Circuits and Systems

  • DOI: https://doi.org/10.1007/978-1-4419-6606-3

  • Publisher: Springer New York, NY

  • eBook Packages: Engineering, Engineering (R0)

  • Copyright Information: Springer Science+Business Media, LLC 2010

  • Hardcover ISBN: 978-1-4419-6605-6 (published 17 September 2010)

  • Softcover ISBN: 978-1-4614-2672-1 (published 05 November 2012)

  • eBook ISBN: 978-1-4419-6606-3 (published 09 September 2010)

  • Series ISSN: 1558-9412

  • Series E-ISSN: 1558-9420

  • Edition Number: 1

  • Number of Pages: X, 246

  • Topics: Circuits and Systems, Electronics and Microelectronics, Instrumentation
