
MIL-STD-883F
METHOD 5010.4
18 June 2004
APPENDIX I
The intent is to get a representative cross section of cell types (i.e., combinational, sequential, input, output). Chains shall
be formed (when necessary to avoid rise and fall time measurement problems) and actual performance data over the full
operating range shall be taken (a provision to extract for multiplexing and I/O buffer delay shall be included). Delay versus
metal wire length and fanout for the above cells shall be determined. The actual performance data shall be submitted to the
qualifying activity along with computer program simulation results. The actual performance data must be within the limits
predicted by the simulation. If multipliers are used to extrapolate performance at the temperature extremes, such multipliers
shall be verified as well.
In addition, for the above cells, a set of pins shall be provided on the test chip for observability. This will enable verification
of functionality of the cells. (Note: Inputs and outputs may be multiplexed).
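
As an illustrative aid only (not part of this method), the following Python sketch shows one way the delay characterization above could be checked: measured test-chip cell delays are compared against simulation-predicted limits after applying temperature multipliers. All cell names, delay values, and multiplier values are hypothetical placeholders.

# Illustrative sketch: compare measured cell delays with simulation limits.
# Simulation-predicted delay limits (ns) at nominal conditions, per cell (hypothetical).
predicted_limits_ns = {
    "nand2":  (0.18, 0.42),   # (min, max) from simulation
    "dff":    (0.55, 1.10),
    "inbuf":  (0.30, 0.75),
    "outbuf": (0.90, 2.10),
}

# Assumed multipliers used to extrapolate the limits to temperature extremes;
# per the paragraph above, such multipliers would themselves have to be verified.
temp_multiplier = {"-55C": 0.85, "+25C": 1.00, "+125C": 1.35}

# Measured delays (ns) from the test chip, per (cell, temperature) -- hypothetical data.
measured_ns = {
    ("nand2", "+125C"): 0.51,
    ("dff", "-55C"): 0.49,
}

def within_predicted(cell, temp, value):
    """Return True if a measured delay falls inside the extrapolated limits."""
    lo, hi = predicted_limits_ns[cell]
    m = temp_multiplier[temp]
    return lo * m <= value <= hi * m

for (cell, temp), value in measured_ns.items():
    ok = within_predicted(cell, temp, value)
    print(f"{cell} at {temp}: {value} ns -> {'within' if ok else 'OUTSIDE'} predicted limits")
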
10.6 CAD routing and post routing simulation. A chip or set of chips shall be submitted for approval and used to qualify
the manufacturer's ability to perform routing and to accurately predict post routing performance. The manufacturer must
submit to the qualifying activity:
a. The actual measured performance data for each function over temperature and voltage.
b. The computer simulation performance prediction.
The two results will remain on file and the actual measured performances must fall between the simulation extremes.
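
As an illustrative aid only (not part of this method), the sketch below shows the kind of bounds check implied by 10.6: measured post-routing path delays at several temperature/voltage points must fall between the best-case and worst-case simulation extremes. Path names and numbers are hypothetical.

# Illustrative sketch: measured post-routing performance vs. simulation extremes.
# Simulation extremes (ns) per routed path: (best-case, worst-case) corners (hypothetical).
sim_extremes_ns = {"path_clk_to_q": (1.2, 3.4), "path_addr_decode": (2.0, 5.6)}

# Measured delays (ns) for the same paths at several temperature/voltage points (hypothetical).
measurements_ns = {
    "path_clk_to_q":    {("-55C", "5.5V"): 1.4, ("+125C", "4.5V"): 3.1},
    "path_addr_decode": {("-55C", "5.5V"): 2.3, ("+125C", "4.5V"): 5.2},
}

for path, (best, worst) in sim_extremes_ns.items():
    for (temp, volt), delay in measurements_ns[path].items():
        status = "within" if best <= delay <= worst else "OUTSIDE"
        print(f"{path} @ {temp}/{volt}: {delay} ns -> {status} simulation extremes")
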
20. APPLICABLE DOCUMENTS (This section is not applicable to this document.)
30. CERTIFICATION QUESTIONS
30.1 Cell libraries.
a. Who is the source for your cell libraries?
Own organization?
Workstation vendors?
Outside commercial vendors?
Universities?
b. What verification or certification is done for cell libraries, including those obtained from outside organizations? Are
macrocells implemented in silicon and verified for functionality and performance limits via actual hardware test? Is
only software simulation performed?
c. How are cell libraries controlled (e.g., level of documentation, maintenance and revisions, specifications,
additions)?
d. Provide company-approved cell library.
e. Identify those implemented and tested in silicon.
f. Is a designer allowed to tailor a macrocell or "roll his own" for a certain application? If so, how is the resulting
macro tested to ensure there are no problems?

30.2 Design process.
a. Who does and who approves the various levels of design?
Requirements definition?
Detail function definition?
Detail design (e.g., gate level design)?
Layout and mask generation?
b. What automatic aids are used for refinement from each design level to the next?
c. What automatic aids are used for verifying the refinement at each level (e.g., automatic checking of layout versus
schematic)?
d. How is automatic placement and routing software verified and certified for use?
30.3 Simulation.
a. What simulators are used for:
Process simulation (e.g., SUPREM-II)?
Circuit simulation (e.g., SPICE, SCEPTRE)?
Gate level simulation (e.g., LASAR, HITS)?
Switch level simulation?
Behavior/function simulation?
Dynamic timing analysis (to include actual delays due to placement and routing)?
b. How are the above simulators verified? Are benchmarks used, and if so, what are these benchmarks?
c. Are the simulation results periodically checked against actual silicon test data (to complete the loop)?
30.4 Test.
a. What test tools are used for:
Automatic test vector generation?
Fault simulation?
Insertion of design-for-testability/built-in-test features? (And are they integrated with the design process?)
b. Who is responsible for test generation:
Foundry?
Customer?
Designer?
c. If test vectors are not generated by the foundry, are the submitted vectors evaluated by the foundry to determine
the percentage of faults detected?
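
As an illustrative aid only (not part of this method), the short sketch below shows the kind of fault-coverage figure referred to in 30.4c: the percentage of faults detected by a set of submitted test vectors. The fault counts are hypothetical.

# Illustrative sketch: fault coverage from a fault-simulation run (hypothetical counts).
faults_total = 12500        # collapsed faults in the fault list
faults_detected = 11875     # faults detected by the submitted test vectors

coverage_pct = 100.0 * faults_detected / faults_total
print(f"Fault coverage: {coverage_pct:.1f}%")   # 95.0% in this example
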

30.5 Design rule checking.
a. Are design constraints enforced by the customers or management, such as:
Synchronous designs only?
Use of an approved set of cells/macrocells only?
Conservative use of electrical and switching limits?
Is the designer able to obtain waivers?
b. What design rule checkers (DRCs) are used for:
Physical rule checks (e.g., minimum spacing)?
Electrical rule checks (e.g., max current density, fanout restrictions)?
Timing rule checks (e.g., worst-case timing paths)?
Logical rule checks (e.g., unclocked feedback paths)?
c. Is each design subjected to the above DRCs?
d. How can the DRC software be shown to "work as advertised"?
e. If "correct by construction" techniques are used, what procedure is used, and how is "correctness" assured?
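
As an illustrative aid only (not part of this method), the sketch below shows a physical design-rule check of the "minimum spacing" kind mentioned in 30.5b, applied to axis-aligned rectangles. The layer name, rule value, and shapes are hypothetical.

# Illustrative sketch: minimum-spacing DRC on axis-aligned rectangles (hypothetical data).
MIN_SPACING_UM = {"metal1": 0.8}   # assumed minimum edge-to-edge spacing per layer

# Rectangles as (layer, x_min, y_min, x_max, y_max) in micrometres.
shapes = [
    ("metal1", 0.0, 0.0, 2.0, 1.0),
    ("metal1", 2.5, 0.0, 4.0, 1.0),   # 0.5 um gap -> violation
    ("metal1", 6.0, 0.0, 8.0, 1.0),
]

def spacing(a, b):
    """Edge-to-edge distance between two rectangles (0 if they overlap)."""
    _, ax0, ay0, ax1, ay1 = a
    _, bx0, by0, bx1, by1 = b
    dx = max(bx0 - ax1, ax0 - bx1, 0.0)
    dy = max(by0 - ay1, ay0 - by1, 0.0)
    return (dx * dx + dy * dy) ** 0.5

for i in range(len(shapes)):
    for j in range(i + 1, len(shapes)):
        a, b = shapes[i], shapes[j]
        if a[0] == b[0]:                     # check same layer only
            rule = MIN_SPACING_UM[a[0]]
            gap = spacing(a, b)
            if 0.0 < gap < rule:
                print(f"Spacing violation on {a[0]}: {gap:.2f} um < {rule} um")
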
30.6 Software control.
a. What are the sources of design and test software?
Own organization?
Workstation vendors?
Outside commercial vendors?
Universities?
b. How is design and test software approved and controlled:
Frequency of major/minor revision?
Trouble reports?
Regression testing?
c. What commercial CAD/CAE workstations or packages are used (e.g., MENTOR, Daisy, Silvar-Lisco)? Are
modifications to any of the software packages permitted?