MIL-STD-883F
TEST METHOD STANDARD MICROCIRCUITS
METHOD 5010.4
18 June 2004
APPENDIX I
10.3 Layout verification. The manufacturer shall retain the results of full mask level design rule checks, electrical rule checks, and connectivity checks (see 10.1) for each application specific design. Rule checking will encompass the rule set provided under 10.1 herein. The manufacturer will explain any rules not checked and all error reports produced by the checker. The LVS checker will ensure that the layout matches exactly the schematic simulated by the ASIC designer. Final layout verification results will not be required if the manufacturer's design methodology is "correct by construction." In this case, the manufacturer will explain the methodology and rules used, as well as any rules not checked and all error reports which were not corrected during construction of the design.
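The LVS and connectivity checks above compare a layout-extracted netlist against the schematic the designer simulated. A minimal sketch of that comparison, with an invented netlist representation (nothing here comes from the standard):

```python
# Toy LVS-style connectivity check: model each netlist as a set of
# (device, pin, net) connections and report every connection that
# appears in one netlist but not the other. The netlist format and
# all device/net names are illustrative assumptions.

def lvs_compare(schematic, layout):
    """Return the set of connectivity mismatches between two netlists."""
    return schematic.symmetric_difference(layout)

schematic = {
    ("M1", "gate", "IN"), ("M1", "drain", "OUT"),
    ("M2", "gate", "IN"), ("M2", "drain", "OUT"),
}
# Layout extraction wired M2's gate to the wrong net -- an LVS error.
layout = {
    ("M1", "gate", "IN"), ("M1", "drain", "OUT"),
    ("M2", "gate", "OUT"), ("M2", "drain", "OUT"),
}

errors = lvs_compare(schematic, layout)
```

An empty `errors` set corresponds to the "layout matches exactly the schematic" condition; any member is an error report the manufacturer would have to explain.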
10.4 Power routing simulation. To be retained by manufacturer; derived from each final application specific electrical design and layout. The worst case simulation of power buses shall show that at no time shall the localized bus current density exceed specification for allowable current density of the power bus material. In addition, at no point in the power bus shall voltage levels exceed design goals for IR drop values from the respective supply. Power routing simulation must be based upon actual placement of cells within the array. Such a simulation may be driven by Monte Carlo methods, or in conjunction with a digital simulator using the selected set of test vectors.
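The two requirements above reduce to a per-segment check on the power bus: current density below the material's allowable limit, and cumulative IR drop within the design goal. A sketch under invented limits, geometry, and currents (none of these numbers come from the standard):

```python
# Worst-case power-bus check: for each bus segment, verify that current
# density stays below the metal's allowable limit, and accumulate the
# resistive voltage drop to compare against the IR-drop design goal.
# All constants and segment data below are illustrative assumptions.

SHEET_RES_OHM_SQ = 0.05   # metal sheet resistance (ohms per square)
J_MAX_MA_PER_UM = 1.0     # allowable current per micron of bus width
IR_DROP_GOAL_MV = 50.0    # design goal for total IR drop

def check_bus(segments):
    """segments: list of (length_um, width_um, current_mA).
    Returns (total_ir_drop_mV, density_ok, ir_drop_ok)."""
    drop_mv = 0.0
    density_ok = True
    for length, width, current in segments:
        if current / width > J_MAX_MA_PER_UM:
            density_ok = False
        # R = sheet_res * (L/W) squares; V = I * R (mA * ohm = mV)
        drop_mv += current * SHEET_RES_OHM_SQ * (length / width)
    return drop_mv, density_ok, drop_mv <= IR_DROP_GOAL_MV

# Three segments of a VDD bus, currents taken from worst-case placement.
result = check_bus([(100.0, 10.0, 8.0), (200.0, 10.0, 5.0), (50.0, 5.0, 4.0)])
```

In a real flow the per-segment currents would come from the placement-driven simulation the paragraph requires, not from hand-entered values.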
10.5 Cell design and simulation qualification. Cell design and simulation qualification shall be accomplished in a two step procedure consisting of:
a. Parameter verification/simulation verification, and
b. Functional verification.
A chip or set of chips, called the cell test chip set, shall be designed to provide access to a set of cells to test performance
characteristics. The cell test chip set design must be submitted to the qualifying activity for approval prior to use. The cell
test chip shall include as a minimum:
Inverter
4-input NAND
2-input AND into 3-input NOR
D latch with active low reset
JK flip-flop with active low reset
TTL input buffer
CMOS input buffer
Output buffer
Three-state I/O buffer with pull-up
The intent is to get a representative cross section of cell types (i.e., combinational, sequential, input, output). Chains shall
be formed (when necessary to avoid rise and fall time measurement problems) and actual performance data over the full
operating range shall be taken (a provision to extract for multiplexing and I/O buffer delay shall be included). Delay versus
metal wire length and fanout for the above cells shall be determined. The actual performance data shall be submitted to the
qualifying activity along with computer program simulation results. The actual performance data must be within the limits
predicted by the simulation. If multipliers are used to extrapolate performance at the temperature extremes, such multipliers
shall be verified as well.
In addition, for the above cells, a set of pins shall be provided on the test chip for observability. This will enable verification
of functionality of the cells. (Note: Inputs and outputs may be multiplexed).
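Delay versus metal wire length and fanout is commonly characterized by fitting a linear model to the measured chain data; the model form and every coefficient below are assumptions for illustration, not values from the standard:

```python
# Characterizing cell delay as a linear function of fanout and routed
# wire length:  delay = d0 + k_fan * fanout + k_wire * wire_length.
# The inverter coefficients below are invented for illustration.

def cell_delay_ns(d0, k_fan, k_wire, fanout, wire_um):
    """Predicted delay (ns) for a given fanout and wire length (um)."""
    return d0 + k_fan * fanout + k_wire * wire_um

# Hypothetical inverter characterization data.
D0, K_FAN, K_WIRE = 0.12, 0.05, 0.0004   # ns, ns per load, ns per um

# Predicted delays at fanouts 1, 2, and 4 driving 100 um of metal.
delays = [cell_delay_ns(D0, K_FAN, K_WIRE, f, 100.0) for f in (1, 2, 4)]
```

Qualification would compare predictions like these against the measured chain data over the full operating range, including any temperature-extreme multipliers.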
10.6 CAD routing and post routing simulation. A chip or set of chips shall be submitted for approval and used to qualify
the manufacturer's ability to perform routing and to accurately predict post routing performance. The manufacturer must
submit to the qualifying activity:
a. The actual measured performance data for each function over temperature and voltage.
b. The computer simulation performance prediction.
The two results will remain on file and the actual measured performances must fall between the simulation extremes.
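The acceptance rule above amounts to an interval check per measured path: every measured value must lie between the best-case and worst-case simulation predictions. A sketch with invented path names and numbers:

```python
# Post-routing qualification check: each measured delay must fall
# within the [best_case, worst_case] envelope predicted by simulation.
# Path names and all delay values below are illustrative assumptions.

def between_extremes(measured, best_case, worst_case):
    """All arguments map path name -> delay (ns). True only if every
    measured delay lies within its simulated extremes."""
    return all(best_case[p] <= d <= worst_case[p]
               for p, d in measured.items())

best  = {"clk_to_q": 0.8, "setup": 0.3}
worst = {"clk_to_q": 2.1, "setup": 0.9}

ok  = between_extremes({"clk_to_q": 1.5, "setup": 0.5}, best, worst)
bad = between_extremes({"clk_to_q": 2.4, "setup": 0.5}, best, worst)
```

Both the measured data and the simulation extremes would remain on file with the qualifying activity, as the paragraph requires.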
20. APPLICABLE DOCUMENTS (This section is not applicable to this document.)
30. CERTIFICATION QUESTIONS
30.1 Cell libraries.
a. Who is the source for your cell libraries?
Own organization?
Work station vendors?
Outside commercial vendors?
Universities?
b. What verification or certification is done for cell libraries, including those obtained from outside organizations? Are
macrocells implemented in silicon and verified for functionality and performance limits via actual hardware test? Is
only software simulation performed?
c. How are cell libraries controlled (e.g., level of documentation, maintenance and revisions, specifications,
additions)?
d. Provide company-approved cell library.
e. Identify those implemented and tested in silicon.
f. Is a designer allowed to tailor a macrocell or "roll his own" for a certain application? If so, how is the resulting macro tested to ensure there are no problems?
30.2 Design process.
a. Who does and who approves the various levels of design?
Requirements definition?
Detail function definition?
Detail design (e.g., gate level design)?
Layout and mask generation?
b. What automatic aids are used for refinement from each design level to the next?
c. What automatic aids are used for verifying the refinement at each level (e.g., automatic checking of layout versus
schematic)?
d. How is automatic placement and routing software verified and certified for use?
30.3 Simulation.
a. What simulators are used for:
Process simulation (e.g., SUPREME-II)?
Circuit simulation (e.g., SPICE, SCEPTRE)?
Gate level simulation (e.g., LASAR, HITS)?
Switch level simulation?
Behavior/function simulation?
Dynamic timing analysis (to include actual delays due to placement and routing)?
b. How are the above simulators verified? Are benchmarks used, and if so, what are these benchmarks?
c. Are the simulation results periodically checked against actual silicon test data (to complete the loop)?
30.4 Test.
a. What test tools are used for:
Automatic test vector generation?
Fault simulation?
Insertion of design-for-testability/built-in-test features? (And are they integrated with the design process?)
b. Who is responsible for test generation:
Foundry?
Customer?
Designer?
c. If test vectors are not generated by the foundry, are the submitted vectors evaluated by the foundry to determine
the percentage of faults detected?
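The fault-detection percentage asked about in c is produced by fault simulation: inject each modeled fault, run the submitted vectors, and count the faults whose output differs from the good circuit. A toy single-stuck-at example on a 2-input NAND (the gate, fault model, and vector sets are invented for illustration):

```python
# Toy single-stuck-at fault simulation on a 2-input NAND gate.
# Coverage = 100 * (faults detected by the vector set) / (total faults).

def nand2(a, b, fault=None):
    """Evaluate NAND; fault is (node, stuck_value) with node in
    {'a', 'b', 'y'}, or None for the fault-free (good) circuit."""
    if fault and fault[0] == 'a':
        a = fault[1]
    if fault and fault[0] == 'b':
        b = fault[1]
    y = 1 - (a & b)
    if fault and fault[0] == 'y':
        y = fault[1]
    return y

# All six single stuck-at faults: each node stuck at 0 and at 1.
FAULTS = [(n, v) for n in ('a', 'b', 'y') for v in (0, 1)]

def coverage(vectors):
    """Percentage of FAULTS detected by the given (a, b) vector set."""
    detected = sum(
        any(nand2(a, b) != nand2(a, b, fault=f) for a, b in vectors)
        for f in FAULTS)
    return 100.0 * detected / len(FAULTS)

full = coverage([(0, 0), (0, 1), (1, 0), (1, 1)])   # exhaustive vectors
partial = coverage([(1, 1)])                         # one submitted vector
```

A foundry evaluating customer-submitted vectors would run this kind of simulation against the full fault list of the actual design rather than a single gate.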