D3PLOT 22.1

Controlling the Die-Closure Calculation Process

Controlling the thickness used

By default the true thickness of both workpiece and die is used when calculating closure. "True" thickness depends on the element type:

(Thin) shells

The current element thickness as reported in the .ptf file. The centreline +/- half this value is used.

Solids and Tk. shells

Zero. (The nodes are assumed to lie on the face surface.)

For workpiece nodes the average thickness of all elements meeting at the node is used; for die facets the actual facet value is used.

For each side individually you can apply a factor to the "true" value, or override it with an explicit value. (You may wish to do this if you have used artificial thicknesses during the calculation, since this will affect contact geometry.)
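The thickness bookkeeping above can be sketched as follows. This is a minimal illustration, not D3PLOT's internal code: the function and argument names are invented, and only the averaging, factor and override rules described above are modelled.

```python
# Hypothetical sketch of the per-node half-thickness used on the
# workpiece side. Names are illustrative, not D3PLOT's API.

def node_half_thickness(adjacent_thicknesses, factor=1.0, override=None):
    """Half-thickness offset used for one workpiece node.

    adjacent_thicknesses: "true" thicknesses of all elements meeting
    at the node (thin shells report their current thickness; solids
    and thick shells contribute 0.0, as their nodes lie on the face).
    factor: multiplier applied to the "true" value for this side.
    override: if given, replaces the true thickness entirely.
    """
    true_t = sum(adjacent_thicknesses) / len(adjacent_thicknesses)
    t = override if override is not None else true_t * factor
    return t / 2.0   # centreline +/- half this value

# Three thin shells of thickness 1.2, 1.0 and 1.1 meet at a node:
print(node_half_thickness([1.2, 1.0, 1.1]))                # average / 2
print(node_half_thickness([1.2, 1.0, 1.1], override=2.0))  # explicit value / 2
```

Using an explicit override, as in the second call, mimics the case where artificial thicknesses were used during the analysis and the true contact geometry must be restored.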

OPTIONS... Controlling calculation parameters

The default parameters chosen by D3PLOT should be satisfactory in most cases, but there are situations when you may wish to alter them.

The OPTIONS... command gives access to the following adjustable parameters.

"Die facet overlap %age" controls the extent to which die facets are artificially enlarged (in the in-plane direction) for the purposes of checking workpiece node projection.

This is important where the die surface is convex with respect to the workpiece, since it helps to prevent workpiece nodes falling into the "tunnels" between projected facet volumes.

This can be illustrated as follows:



Using the Die facet overlap %age value to prevent nodes being lost on convex surfaces

When workpiece nodes are being projected onto a convex die surface problems can arise when nodes lie in the dead zones ("tunnels") where facets meet.

This is illustrated here: nodes marked * lie in the areas opposite die facets, and so "know" which facet they are likely to be projected onto.

Nodes marked + in the shaded tunnel areas don't "know" which facet they will be projected onto, and so no closure value is calculated for them.

Clearly this problem becomes more acute as the distance of a workpiece node from the die increases, which is the main reason why closure values for such distant nodes may appear to be randomly classified as "uncomputed" and given a value of -1.0.

The situation can be improved by artificially increasing the width of die facets so that they overlap, using the "Die facet overlap %age" factor.

This is illustrated here: a value of 10% (the default) has been used, and this shows how the "tunnel" dead zone has been made smaller, and also extended further away from the die surface.

Clearly larger values will lead to fewer nodes being lost in "tunnels", but this must be balanced against the risk of invalid closure values being calculated when nodes are projected onto the wrong facets because these have been artificially extended.

This is a case for engineering judgement.
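The geometry involved can be illustrated with a minimal 2-D sketch. This is not D3PLOT's algorithm (the function names and the segment representation of a facet are invented); it only shows how enlarging a facet in-plane captures a node that would otherwise fall into a "tunnel":

```python
# Illustrative 2-D sketch: a die "facet" is reduced to a line segment.
# Enlarging it in-plane by an overlap percentage widens the band of
# workpiece nodes that project onto it, shrinking the "tunnel" dead
# zones between adjacent facets on a convex surface.

def enlarge_segment(p0, p1, overlap_pct):
    """Extend a segment symmetrically about its midpoint by overlap_pct %."""
    mx, my = (p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2
    s = 1.0 + overlap_pct / 100.0
    grow = lambda p: (mx + (p[0] - mx) * s, my + (p[1] - my) * s)
    return grow(p0), grow(p1)

def projects_onto(node, p0, p1):
    """True if the node's perpendicular projection lands within the segment."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    t = ((node[0] - p0[0]) * dx + (node[1] - p0[1]) * dy) / (dx * dx + dy * dy)
    return 0.0 <= t <= 1.0

facet = ((0.0, 0.0), (1.0, 0.0))
node = (1.04, 0.5)                      # just beyond the facet's edge
print(projects_onto(node, *facet))      # False: node is in a "tunnel"
q0, q1 = enlarge_segment(*facet, 10.0)  # the default 10% overlap
print(projects_onto(node, q0, q1))      # True: the overlap captures it
```

The trade-off described above is visible here too: a large enough overlap would also capture nodes that really belong to a neighbouring facet.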

Using the Sort Bucket oversize %age to include distant die facets

The volume of space used for bucket sorting is based on the dimensions of the workpiece, not the die. Therefore if the die is much larger, or some considerable distance away, then there is a good chance that many facets on it will lie outside the bucket volume, and this means that they won't be considered when computing closure.

Increasing the Sort bucket oversize %age value from its default of 20% may include these facets, BUT:
  • Closure values calculated from distant facets are likely to be unreliable, since by the time the workpiece and die meet at that point their respective structures will probably have been deformed so much that contact will actually occur at some other location.
  • Increasing the bucket volume will increase the number of nodes and facets in each bucket. The time taken to compute closure rises as a function of ( N Workpiece * N Facet ) in each bucket, and these values increase as a function of bucket volume, ie linear dimension cubed. So increasing the bucket size can lead to a rapid rise in the time taken to compute closure, in the worst case by a fourth power.
Or, put another way, you need good reasons to increase this value!
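A back-of-envelope model of the per-bucket cost described above makes the point concrete. All numbers and names here are invented for illustration; it only encodes the two stated facts: per-bucket work scales with N Workpiece * N Facet, and each count grows with bucket volume, i.e. the cube of the linear dimension.

```python
# Rough cost model (an illustration, not D3PLOT's internals).

def per_bucket_work(nodes_per_unit_vol, facets_per_unit_vol, oversize_pct):
    """Relative closure work in one bucket for a given oversize %age."""
    side = 1.0 + oversize_pct / 100.0        # relative linear bucket size
    vol = side ** 3                          # counts grow with volume
    n_nodes = nodes_per_unit_vol * vol
    n_facets = facets_per_unit_vol * vol
    return n_nodes * n_facets                # work ~ N Workpiece * N Facet

base = per_bucket_work(100, 100, 20)         # the 20% default
for pct in (20, 50, 100):
    w = per_bucket_work(100, 100, pct)
    print(f"oversize {pct:3d}% -> {w / base:.1f}x default per-bucket work")
```

Even this simplified model shows per-bucket work climbing steeply with oversize, which is why the default should only be raised with good reason.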

Controlling the values used for uncomputed, uninvolved and distant nodes

These values do not affect the calculation at all, only the values assigned to special cases.

Set uncomputed to:

This is the value assigned to nodes on the workpiece for which no closure value can be computed. By default this is -1.0, but you can choose any value, although you should avoid values that might be confused with valid results (ie zero or small +ve numbers).

Set uninvolved to:

This is the value assigned to all nodes in the model that are not part of the workpiece. The default is -2.0, but again you can choose any sensible value.

Max distance val:

Nodes with computed closure values greater than this value are reset to the "uncomputed" value. By default this is +1.0e20, ie no nodes will ever fall into this category, but you can set it to a sensible upper-bound value.

The advantage of using negative values for uncomputed and uninvolved nodes is that they can never be valid closure distances, and so can be isolated during contouring by judicious choice of contour bounds, or excluded from plots altogether with the Limiting Values option in the CONTOUR menu.
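The special-value convention can be sketched as follows. The flag values are the defaults quoted above; the closure array itself is invented for illustration:

```python
# Default special values for the die-closure results (as quoted above).
UNCOMPUTED = -1.0    # workpiece node with no computable closure
UNINVOLVED = -2.0    # node not on the workpiece
MAX_DIST   = 1.0e20  # closures above this are reset to UNCOMPUTED

# An invented set of nodal results: two valid closures, one uncomputed
# node, one absurdly distant value, one node not on the workpiece.
closures = [0.8, -1.0, 2.5e20, -2.0, 1.3]
cleaned = [UNCOMPUTED if c > MAX_DIST else c for c in closures]

# Negative flags can never be valid closure distances, so a contour
# lower bound of 0.0 isolates the genuine results:
valid = [c for c in cleaned if c >= 0.0]
print(valid)   # only genuine closure values remain
```

This mirrors what a contour lower bound of 0.0 (or the Limiting Values option) achieves in the CONTOUR menu: the flagged nodes drop out of the plot without affecting the calculation.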