Computer aided tolerancing (CAT) covers a wide range of subjects, including specification and standardization; tolerancing in design, manufacturing processes, and product life management; verification and metrology; and functional tolerancing.

Dimensioning and tolerancing standards originated about 77 years ago in the form of various national and company standards that governed engineering drafting and documentation practices. These standards have evolved, and their rapid development has brought many significant changes to tolerancing in design and manufacturing. The release of ISO 14405-1:2010 (ISO, 2010) introduced a rich new set of size specification modifiers, which includes two-point and spherical local sizes; least-squares, maximum-inscribed, and minimum-circumscribed associations; and calculated diameters. Morse et al. (2012) present “size” as a fundamental engineering notion from several viewpoints, trace its evolution in engineering drawings, and discuss the implications of the use of size modifiers.

Many researchers have devoted their efforts to tolerancing modeling. Davidson et al. (2002) develop the tolerance map (T-Map) model (Patent No. 6963824), a hypothetical Euclidean volume of points whose shape, size, and internal subsets represent all possible variations in size, position, form, and orientation of a target feature. Jiang et al. (2014) describe the use of T-Maps and manufacturing maps (M-Maps) to establish analytical relationships among all relevant design and machining tolerances for the transfer of cylindrical data. Clément et al. (1991) introduce the small displacement torsor (SDT) model, which uses six small displacements to represent the position and orientation of an ideal surface in relation to another ideal surface in a kinematic way. Giordano et al. (2007) apply deviation domains to axi-symmetric cases, reducing the space to at most three dimensions instead of the six of the general case. Desrochers et al. (2003) put forward a unified Jacobian-Torsor model which combines the advantages of the torsor model and the Jacobian matrix. Ghie et al. (2010) describe how the same set of interval-based deterministic equations can be used in a statistical context. Anwer et al. (2013) investigate the fundamentals of the skin model at the conceptual, geometric, and computational levels and present representation and simulation issues for product design. In a later paper (Anwer et al., 2014), they investigate the concept of skin model shapes, developed to provide a digital representation of “non-ideal” parts and extended to mechanical assemblies. This concept is an interesting solution for tolerance analysis, in the same way as finite element analysis, inspection analysis, and other discrete-geometry-based analyses in mechanical engineering.
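
To make the torsor representation concrete, the following minimal sketch (our own illustration, not any of the cited authors’ implementations; the function name and numerical values are hypothetical) applies an SDT, three small translations and three small rotations, to points of a nominal surface under the usual first-order small-displacement assumption.

```python
import numpy as np

def apply_sdt(points, translation, rotation):
    """First-order displacement of surface points under a small displacement
    torsor (SDT): d(P) = t + w x OP, where t is the small translation vector
    and w the small rotation vector. Illustrative sketch only; second-order
    terms in the rotations are neglected."""
    points = np.asarray(points, dtype=float)
    t = np.asarray(translation, dtype=float)
    w = np.asarray(rotation, dtype=float)
    return points + t + np.cross(w, points)

# Example: displace three points of a nominal plane z = 0 by a torsor with a
# 0.01 mm translation along z and a 0.001 rad rotation about x (made-up values).
nominal = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]]
displaced = apply_sdt(nominal, translation=[0, 0, 0.01], rotation=[0.001, 0, 0])
print(displaced)
```

Deviation domains and the Jacobian-Torsor model build on the same six-component parameterization, bounding these small displacements or propagating them through an assembly.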

Tolerance analysis, an essential activity in industry, carries considerable weight in concurrent engineering and is a key means of ensuring higher quality and lower costs. Dantan et al. (2013) deal with the formulation of tolerance analysis, and more particularly with the uncertainty that must be taken into account in the foundation of this formulation. Walter et al. (2013) consider an extension of the existing “integrated tolerance analysis of systems in motion” approach. Mansuy et al. (2011) present an original method for developing specifications based on standards and calculating tolerances for serial assemblies (stackings) without clearances; the method uses influence coefficients to obtain the relationship between the functional tolerance and the tolerances associated with the geometry of the mechanism’s interface surfaces. Qureshi et al. (2012) propose a statistical tolerance analysis approach for over-constrained mechanisms based on optimization and Monte Carlo simulation. Bruyère et al. (2007) propose a tolerance analysis approach for gears that combines a vectorial dimensioning and tolerancing model, compatible with both conventional gear tolerancing practice and geometric tolerancing practice, with a digital simulation based on tooth contact analysis and Monte Carlo simulation. Gao et al. (1998) introduce a direct linearization method (DLM) based on the first-order Taylor series expansion of vector-loop-based assembly models, which use vectors to represent either component or assembly dimensions.

Worst-case tolerance analysis gives results that are overly pessimistic, which increases product cost. Statistical tolerance analysis takes the statistical behavior of manufacturing variations into consideration; it is a more practical and economical way of looking at tolerances and sets them so as to ensure a desired yield. By permitting a small fraction of assemblies not to assemble or function as required, the tolerances of individual dimensions may be increased, and in turn manufacturing costs may be reduced significantly (Nigam and Turner, 1995). Statistical tolerancing methods include the root-sum-squares (RSS) method (Bender, 1968), system moments (Evans, 1975a; 1975b), quadrature (Evans, 1971; 1972), the reliability index (Parkinson, 1982; Lee and Woo, 1990), the Taguchi method (Taguchi, 1978; D’Errico and Zaino, 1988), and Monte Carlo simulation (Bruyère et al., 2007; Dantan and Qureshi, 2009; Wu et al., 2009; Qureshi et al., 2012).
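
To illustrate the contrast on a simple case, the sketch below (our own, with a hypothetical one-dimensional linear stack-up and made-up tolerance values) compares the worst-case spread, the root-sum-squares (RSS) estimate, and a Monte Carlo estimate for the same chain of dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear chain: gap = sum(c_i * x_i), with symmetric
# tolerances t_i on each dimension x_i about its nominal value.
coeffs = np.array([+1.0, -1.0, -1.0, -1.0])     # sensitivities c_i
nominals = np.array([40.0, 12.0, 13.0, 14.0])   # nominal dimensions (mm)
tols = np.array([0.10, 0.05, 0.05, 0.05])       # symmetric tolerances (mm)

nominal_gap = coeffs @ nominals

# Worst case: every deviation simultaneously at its unfavorable limit.
worst_case = np.sum(np.abs(coeffs) * tols)

# RSS: deviations treated as independent, with each tolerance taken as 3 sigma.
rss = np.sqrt(np.sum((coeffs * tols) ** 2))

# Monte Carlo: sample normal deviations with sigma = t_i / 3.
samples = rng.normal(0.0, tols / 3.0, size=(100_000, len(tols)))
gaps = nominal_gap + samples @ coeffs
fraction_out = np.mean(np.abs(gaps - nominal_gap) > rss)

print(f"nominal gap      : {nominal_gap:.3f} mm")
print(f"worst-case spread: +/-{worst_case:.3f} mm")
print(f"RSS spread (3s)  : +/-{rss:.3f} mm")
print(f"MC fraction beyond RSS limit: {fraction_out:.4f}")
```

For contributors of comparable magnitude, the worst-case spread grows linearly with the number of dimensions in the chain, whereas the RSS spread grows only with its square root, which is precisely why statistical tolerancing permits looser individual tolerances for the same assembly yield.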

Quality control, verification, and metrology have long been a focus of manufacturing enterprises. Within quality control, accurate evaluation of measurement uncertainties is a real challenge in improving the use of coordinate measuring machines (CMMs). Ballu and Mathieu (1996) propose a univocal expression of functional and geometrical tolerances for design, manufacturing, and inspection; this language was retained by ISO for the geometrical product specification (GPS) standard ISO 17450-1:2005 (ISO, 2005). Sprauel et al. (2003) describe a new method, based on a statistical approach, to deduce instantaneous measurement uncertainties directly from the set of acquired coordinates. Krämer and Weckenmann (2010) describe the fusion of multi-energy stacks to measure objects of high aspect ratio and parts consisting of different absorbing materials. Savio et al. (2002) compare two experimental methods for establishing the traceability of freeform measurements on coordinate measuring machines: (i) uncertainty assessment using modular freeform gauges, and (ii) uncertainty assessment using uncalibrated objects; they demonstrate the feasibility of both approaches for freeform geometries through the calibration of a turbine blade. Moroni and Petrò (2014) propose a model for evaluating the overall inspection cost based on uncertainty evaluation, and propose two methodologies for evaluating the uncertainty.
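
In the spirit of deducing uncertainty information directly from the acquired coordinates, the sketch below (a simplified illustration of ours, not the method of Sprauel et al.; the point set, noise level, and uncertainty indicator are made up) associates a least-squares circle to simulated CMM points and derives a crude uncertainty indicator for the radius from the residual scatter.

```python
import numpy as np

def fit_circle_least_squares(x, y):
    """Algebraic (Kasa) least-squares circle fit.
    Solves x^2 + y^2 = 2*a*x + 2*b*y + c, giving center (a, b) and radius
    sqrt(c + a^2 + b^2). Sketch only: industrial CMM software uses more
    elaborate associations and task-specific uncertainty budgets."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a**2 + b**2)
    return (a, b), radius

# Hypothetical probed points on a nominally 20 mm diameter bore,
# perturbed by probing noise.
rng = np.random.default_rng(1)
theta = np.linspace(0.0, 2 * np.pi, 24, endpoint=False)
x = 10.0 * np.cos(theta) + rng.normal(0, 0.002, theta.size)
y = 10.0 * np.sin(theta) + rng.normal(0, 0.002, theta.size)

center, radius = fit_circle_least_squares(x, y)
residuals = np.hypot(x - center[0], y - center[1]) - radius
# A simple, point-scatter-based uncertainty indicator for the radius.
u_radius = residuals.std(ddof=3) / np.sqrt(theta.size)
print(f"diameter = {2 * radius:.4f} mm, u(radius) ~ {u_radius:.4f} mm")
```

Real CMM uncertainty evaluation must, of course, also account for systematic machine errors, the probing strategy, and environmental effects, which is what makes task-specific evaluation challenging.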

Functional tolerancing has now been well accepted by industry and has become a major field of interest for academia. McAdams (2003) develops tolerance design principles through a careful study of the literature, observation of commonly recurring tolerance solutions, and the design strategies implied by the existing tolerance design literature; these principles provide a focus for developing new methodologies with high impact on engineering practice. Islam (2004) describes the development of a prototype software package for solving functional dimensioning and tolerancing (FD&T) problems in a concurrent engineering environment. Hunter et al. (2008) use a functional tolerance model that provides a complete framework for defining geometric dimensioning and tolerancing and its relationship with the part geometry and the inspection process, in order to link computer aided design and computer aided inspection systems. Yang et al. (2013) briefly compare existing 3D functional tolerance analysis models and propose a statistical tolerancing approach based on the variation of point-sets. Cao et al. (2013) propose a scheme for functional specification in accordance with the new generation of geometrical product specifications. To study functional tolerance specification methods consistent with the new generation of GPS, Yang et al. (2010) study the classes of positioning joints in part assembly and the principle of determining their priority on the basis of the GPS definition of invariance classes; in that paper, a new functional tolerancing method, from geometrical functional requirements to geometrical specifications, is presented. Etienne et al. (2008) propose an approach for allocating functional tolerances that provides the best ratio between functional performance and manufacturing cost.

The CIRP (International Academy for Production Engineering) Conference on Computer Aided Tolerancing (CAT) is held every two years and is initiated and supported scientifically by two CIRP Scientific Technical Committees (STCs), Design (STC Dn) and Precision Metrology (STC P), to address the emerging problems of CAT, which plays a prominent role at the interface between product design and manufacturing. The 13th CIRP CAT Conference, held at Zhejiang University, Hangzhou, China during May 11–14, 2014, was the successor to the twelve earlier conferences held in Israel (1989), the USA (1991; 2003; 2005), France (1993; 2001; 2009), Japan (1995), Canada (1997), the Netherlands (1999), Germany (2007), and the UK (2012).

Note that the evolution of CAT involves far more than the research results mentioned above and benefits from the efforts of people from many different cultures and backgrounds. We are pleased to publish in this special part issue a selection of six papers that were presented at the conference in Hangzhou. These papers cover a wide spectrum of current international research in CAT.

For purposes of automating the assignment of tolerances during design, a mathematical model, called the T-Map, has been produced for most of the tolerance classes used by designers. Like a deviation domain, a T-Map is a hypothetical Euclidean volume which represents all possible deviations in size, orientation, and position of a feature. The paper titled “Tolerance-Maps for line-profiles constructed from Boolean intersection of T-Map primitives for arc-segments” proposes a method to produce a T-Map for a complete line profile of any shape. The method first decomposes the profile into segments, then creates a solid-model T-Map primitive for each, and finally combines these by Boolean intersection to generate the T-Map of the profile.
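
The Boolean-intersection step can be mimicked computationally: if each primitive T-Map is represented as a convex region of the deviation space, a deviation belongs to the composite T-Map only if it belongs to every primitive. The sketch below is our own simplified illustration with two made-up half-space primitives in a two-dimensional deviation space, not the actual T-Map geometry of the paper.

```python
import numpy as np

# Each primitive T-Map is modeled here as a convex region given by
# half-space constraints A @ d <= b over a deviation vector d.
# These matrices are illustrative placeholders, not real T-Map geometry.
primitive_a = (np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]]),
               np.array([0.05, 0.05, 0.08, 0.08]))
primitive_b = (np.array([[1.0, 1.0], [-1.0, -1.0]]),
               np.array([0.06, 0.06]))

def in_tmap(deviation, primitives):
    """Membership in the composite T-Map = Boolean intersection of all
    primitive T-Maps (the deviation must satisfy every constraint set)."""
    d = np.asarray(deviation)
    return all(np.all(A @ d <= b) for A, b in primitives)

print(in_tmap([0.03, 0.02], [primitive_a, primitive_b]))  # True
print(in_tmap([0.05, 0.05], [primitive_a, primitive_b]))  # False: violates primitive_b
```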

Tolerance analysis is attracting increasing attention from different disciplines. The paper titled “An iterative statistical tolerance analysis procedure to deal with linearized behavior models” analyzes the impact of a linearization strategy on the estimation of the probability of failure, and proposes an iterative procedure for the assembly requirement that provides accurate results without running the entire Monte Carlo simulation. In the paper titled “A statistical method to identify main contributing tolerances in assemblability studies based on convex hull techniques”, the authors propose adopting a global sensitivity analysis based on deviation domains to obtain recommendations for optimizing tolerance values, and apply the method to assemblability studies.
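
As a rough illustration of why linearization matters for failure-probability estimation, the sketch below (our own; the clearance function, its coupling term, and the standard deviations are hypothetical) compares a full Monte Carlo estimate on a nonlinear assembly response with the estimate obtained from its first-order (linearized) approximation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nonlinear assembly response: clearance as a function of
# two dimensional deviations d1, d2 (values chosen only for illustration).
def clearance(d1, d2):
    return 0.05 - d1 - d2 + 4.0 * d1 * d2   # nonlinear coupling term

sigma = 0.02
d1 = rng.normal(0.0, sigma, 200_000)
d2 = rng.normal(0.0, sigma, 200_000)

# Full Monte Carlo estimate of the probability of failure (clearance < 0).
pof_full = np.mean(clearance(d1, d2) < 0.0)

# Linearized behavior model: first-order Taylor expansion about nominal,
# dropping the d1*d2 coupling term.
pof_linear = np.mean((0.05 - d1 - d2) < 0.0)

print(f"P(failure), full model      : {pof_full:.4f}")
print(f"P(failure), linearized model: {pof_linear:.4f}")
```

A noticeable gap between the two estimates is exactly the situation in which an iterative refinement, such as the procedure proposed in the paper, is needed to recover accurate results without a full Monte Carlo simulation.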

Tolerances influence the quality of manufactured surfaces. The paper titled “Effects of geometric and spindle errors on the quality of end turning surface” develops an integrated volumetric error model of a lathe using rigid body dynamics and homogeneous coordinate transformations. After a simulated surface is generated by a linear mapping of the volumetric errors onto the ideal turning surface, the effect of the volumetric errors on the precision and quality of the turning surface is analyzed. Since such errors affect quality, in the paper titled “An adaptive design method for understanding tolerance in the precision stamping process”, the authors propose an adaptive control method for tolerance. First, the tolerance fluctuations caused by the precision stamping elements in the manufacturing process are analyzed. Then, a condition-driven adaptive control system is constructed based on a monitoring system and a hydraulic control system. Thirdly, executive parameters (such as velocity, pressure, and gaps) are calculated in the control module, so that the stamping tolerances of precision parts are kept accurate throughout the precision stamping process.
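
A minimal sketch of the homogeneous-coordinate-transformation idea follows (our own illustration; the axis assignments and error values are placeholders, not those of the paper): small angular and translational errors of each axis are written as 4×4 homogeneous transforms and composed along the kinematic chain to obtain the volumetric error at the tool tip.

```python
import numpy as np

def htm(rx=0.0, ry=0.0, rz=0.0, tx=0.0, ty=0.0, tz=0.0):
    """Small-angle 4x4 homogeneous transform with rotational errors (rad)
    and translational errors (mm); first-order approximation."""
    return np.array([[1.0, -rz,  ry, tx],
                     [ rz, 1.0, -rx, ty],
                     [-ry,  rx, 1.0, tz],
                     [0.0, 0.0, 0.0, 1.0]])

# Hypothetical axis error motions of a lathe (straightness, angular errors,
# spindle radial offset); the values are placeholders for illustration.
T_x_axis = htm(ry=2e-5, tz=0.003)    # cross-slide errors
T_z_axis = htm(rx=1e-5, ty=-0.002)   # carriage errors
T_spindle = htm(tx=0.001)            # spindle radial offset

tool_nominal = np.array([50.0, 0.0, 100.0, 1.0])  # tool tip, homogeneous coords
tool_actual = T_spindle @ T_z_axis @ T_x_axis @ tool_nominal
volumetric_error = tool_actual[:3] - tool_nominal[:3]
print(volumetric_error)
```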

Linear convolution and morphological (nonlinear) operations are two kinds of operations that are widely applied in the field of surface measurement. The paper titled “A theoretical insight into morphological operations in surface measurement by introducing the slope transform” introduces a counterpart transform, called the slope transform, which provides an analytical capability for morphological operations. By investigating changes in slope and curvature, the slope transform offers a deeper understanding of morphological operations in surface measurement.
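
For readers less familiar with the morphological side, the sketch below (our own illustration with a made-up profile; the paper itself works analytically via the slope transform) implements a one-dimensional morphological closing, dilation followed by erosion, with a disk structuring element, which mimics rolling a ball of that radius over the measured profile.

```python
import numpy as np

def dilate(profile, se):
    """Grey-scale dilation: upper envelope of the profile raised by the
    structuring element (max-plus convolution)."""
    n, m = len(profile), len(se)
    half = m // 2
    padded = np.pad(profile, half, mode="edge")
    return np.array([np.max(padded[i:i + m] + se) for i in range(n)])

def erode(profile, se):
    """Grey-scale erosion (min-minus convolution)."""
    n, m = len(profile), len(se)
    half = m // 2
    padded = np.pad(profile, half, mode="edge")
    return np.array([np.min(padded[i:i + m] - se) for i in range(n)])

def closing(profile, se):
    """Morphological closing = dilation followed by erosion; with a disk
    structuring element this mimics rolling a ball over the surface."""
    return erode(dilate(profile, se), se)

# Hypothetical measured profile (mm) and a disk structuring element of
# radius 0.5 mm sampled at the same 0.05 mm spacing.
dx, radius = 0.05, 0.5
x = np.arange(0.0, 10.0, dx)
profile = (0.02 * np.sin(2 * np.pi * x / 2.5)
           + 0.005 * np.random.default_rng(3).normal(size=x.size))
k = int(radius / dx)
xs = np.arange(-k, k + 1) * dx
se = np.sqrt(radius**2 - xs**2) - radius   # upper half of the disk, apex at 0

closed = closing(profile, se)
print(f"mean offset introduced by closing: {np.mean(closed - profile):.5f} mm")
```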

CIRP CAT 2014 was sponsored by the International Academy for Production Engineering and the School of Mechanical Engineering, Zhejiang University, and co-sponsored by the National Natural Science Foundation of China (NSFC). On behalf of the organizing committee, we thank the keynote speakers and presenters for all their support, which was indispensable in delivering a successful conference.