23 May 2011

The space of all possible bridge shapes: Part 2

In the previous post I discussed Stephen Wolfram's proposition that there exists a space of all possible bridge designs, and that if modern computing techniques could generate that space from a set of simple rules, we might find within it new and unpredictable bridge forms which improve on traditional ideas.

A key challenge is how to find the better designs, a process which involves testing each option against whichever criteria are required. This can be computationally intensive, but in the age of cloud computing it becomes a little more feasible. Sensible engineers might reasonably object that analysing a non-trivial array of structural models would defeat even the greatest computing resources currently available, and I would sympathise.
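
To make the scale of that objection concrete, here is a minimal sketch of the generate-and-test idea in Python. Everything in it is invented for illustration: the design "space" is just a Cartesian product of two truss parameters, and score() is a placeholder for the structural analysis that would, in reality, have to run once per candidate.

```python
import itertools

# A minimal generate-and-test sketch. The parameters and the score()
# placeholder are invented; a real version would run a structural
# analysis for every candidate, which is where the cost lies.

def enumerate_designs(bay_counts, depths):
    """Yield every combination of a few simple truss parameters."""
    for bays, depth in itertools.product(bay_counts, depths):
        yield {"bays": bays, "depth": depth}

def score(design):
    """Placeholder fitness (lower is better); stands in for analysis."""
    return design["bays"] * abs(design["depth"] - 2.5)

best = min(enumerate_designs(range(4, 12), [1.5, 2.0, 2.5, 3.0]), key=score)
print(best)   # {'bays': 4, 'depth': 2.5} with these invented numbers
```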

I wonder, however, whether this isn't primarily a flaw of traditional analytical technologies such as the finite-element stiffness matrix method.

I recall a project from some years ago (VISABO) which used Newtonian mechanics in a manner more closely related to Wolfram's cellular automata, exploiting "intelligent" structural elements, each of which contained its own rules of physics, with global behaviour emerging naturally from their relationships. This has the potential to allow the change in structural response resulting from a change in structural form to be analysed much more quickly: individual members react dynamically when another member is moved, added, or eliminated.
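
To illustrate that local-rules idea (my own sketch, not VISABO's actual method), the dynamic relaxation loop below gives each spring-like member only its own stiffness and rest length. Equilibrium under a point load emerges from repeated local updates, with no global stiffness matrix ever assembled; all the values are arbitrary.

```python
import numpy as np

# Each member is a spring that knows only its own stiffness and rest
# length, and pushes on its two end nodes. Global equilibrium emerges
# from repeated local updates (dynamic relaxation), not from solving
# an assembled stiffness matrix.

nodes = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])   # x, y positions
fixed = np.array([True, False, True])                     # supports at ends
springs = [(0, 1), (1, 2)]                                # member connectivity
stiffness, rest_len = 100.0, 1.0
load = np.array([[0.0, 0.0], [0.0, -5.0], [0.0, 0.0]])    # point load at mid-node

vel = np.zeros_like(nodes)
dt, damping = 0.01, 0.9

for _ in range(2000):
    force = load.copy()
    for i, j in springs:
        d = nodes[j] - nodes[i]
        length = np.linalg.norm(d)
        f = stiffness * (length - rest_len) * d / length   # axial spring force
        force[i] += f
        force[j] -= f
    vel = damping * (vel + dt * force)
    vel[fixed] = 0.0               # supports do not move
    nodes += dt * vel

print(nodes[1])   # deflected mid-node, roughly [1.0, -0.38] here
```

Moving, adding or deleting a spring needs no re-assembly step at all: the loop simply carries on with the changed member list, which is what makes this approach attractive for exploring many design variants.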

Another, perhaps more accessible, example is the series of Bridge Builder games (pictured above right), which appear to use the same principle (they certainly don't use finite element analysis!).

Closer to the professional arena, there is Daniel Piker's Kangaroo (pictured left), an add-on for Grasshopper / Rhino which carries out a similar physics-based simulation, and which is being explicitly promoted for structural modelling purposes, e.g. the form-finding of catenary structures.
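
For the catenary, at least, there is also a closed-form answer, which makes a handy sanity check on any physics-based form-finder. The sketch below is plain Python rather than anything from Kangaroo itself: it solves the catenary y = a·cosh(x/a) for the parameter a by bisection, given a span and a cable length, and reports the midspan sag.

```python
import math

# Closed-form check for a catenary form-finder: given span S and cable
# length L > S, solve L = 2a*sinh(S/(2a)) for the parameter a of
# y = a*cosh(x/a), then report the sag at midspan. Numbers illustrative.

def catenary_sag(span, cable_len):
    f = lambda a: 2.0 * a * math.sinh(span / (2.0 * a)) - cable_len
    lo, hi = span / 1000.0, 1e6 * span     # bracket with f(lo) > 0 > f(hi)
    for _ in range(200):                   # bisection on a
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid                       # f decreases as a grows
        else:
            hi = mid
    a = 0.5 * (lo + hi)
    return a * (math.cosh(span / (2.0 * a)) - 1.0)   # sag below supports

print(catenary_sag(span=10.0, cable_len=11.0))       # about 1.98
```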

Nonetheless, I think that non-trivial analysis may still remain computationally too expensive, particularly for structures governed by continuum behaviour rather than by discrete elements such as beams and frames, or where non-linear, dynamic or global buckling behaviour determines performance.

Analysis of the individual designs is only half the problem: it's also necessary to test them against pre-defined criteria to decide which are optimal (or, at least, superior to neighbouring designs). Researchers like Wolfram seem to believe that "economy" is readily measurable, e.g. by least material. However, real-life economy in bridges is intimately linked to simplicity of construction, and structures which are regular and repetitious are generally cheaper to manufacture and assemble than those which are highly variable. A classic example is the simple rolled steel beam, which contains considerable quantities of material resisting very low stress, yet is almost always cheaper to supply than a latticework or variable-section plate girder where the stresses have been made more uniform.
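
As a sketch of what a more realistic economy measure might look like (all rates invented for illustration): price the material volume at a notional rate, then add a fabrication setup penalty for every distinct section used. The leaner "fully stressed" design can then come out more expensive than the heavier uniform one, which is exactly the rolled-beam point above.

```python
# Invented cost model: material volume alone is a poor proxy, because
# repetition is cheap and variability is expensive. All rates notional.

def fabrication_cost(member_lengths, member_sections,
                     steel_rate=10000.0,   # notional cost per m^3 of steel
                     setup_rate=500.0):    # notional cost per distinct section
    """Material cost plus a penalty for each distinct section used."""
    volume = sum(L * A for L, A in zip(member_lengths, member_sections))
    distinct = len(set(member_sections))
    return steel_rate * volume + setup_rate * distinct

# A uniform design: more steel, one section throughout.
uniform = fabrication_cost([5.0] * 8, [0.01] * 8)
# A 'fully stressed' design: less steel, eight different sections.
tailored = fabrication_cost([5.0] * 8, [0.004, 0.005, 0.006, 0.007,
                                        0.008, 0.009, 0.010, 0.011])
print(uniform, tailored)   # 4500.0 7000.0: the leaner design costs more
```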

For trusses of the sort that Wolfram takes as his example, it is likely that his process will find an optimum two-dimensional truss with irregular bay sizes or truss angles, reflecting the variation of shear, and with curved top and bottom chords, reflecting the variation of bending moment. But in three dimensions, truss members do more than carry shear and bending: they also resist out-of-plane buckling, and regular bays can make the deck design more economic. Curved chord members can similarly increase fabrication costs by more than the more uniform stress saves in material. How then can economy be easily assessed?
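
Part of the difficulty is that these competing effects pull in opposite directions yet can each be quantified. The out-of-plane buckling point, for instance, follows from Euler's formula for a pin-ended strut, P_cr = π²EI/L²: capacity falls with the square of member length, so an irregular layout that lengthens some members can lose more in buckling resistance than it saves in material. The numbers below are illustrative only.

```python
import math

# Euler buckling of a pin-ended strut, P_cr = pi^2 * E * I / L^2.
# Capacity drops with the square of length, so longer irregular bays
# pay a steep buckling penalty. Section properties are invented.

E = 210e9            # steel Young's modulus, Pa
I = 8.0e-6           # second moment of area, m^4

for L in [2.0, 2.5, 3.0, 4.0]:
    P_cr = math.pi**2 * E * I / L**2
    print(f"L = {L:.1f} m  ->  P_cr = {P_cr/1e6:.2f} MN")
```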

If economy is difficult, what of robustness? How can that be readily measured in a manner which is quickly repeatable across a large array of possible designs?
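
One crude but quickly repeatable proxy, offered here only as a suggestion: delete each member in turn and check whether what remains is still connected. A genuine robustness measure would re-analyse the damaged structure, but even this simple graph check can be run across thousands of candidate designs in moments.

```python
# Crude robustness proxy (a suggestion only): remove each member in
# turn and flood-fill to see if the structure stays connected. It
# merely flags non-redundant members; a real measure would re-analyse
# the damaged structure.

def still_connected(n_nodes, members, removed):
    """Flood-fill from node 0 through the surviving members."""
    adj = {i: set() for i in range(n_nodes)}
    for k, (i, j) in enumerate(members):
        if k != removed:
            adj[i].add(j)
            adj[j].add(i)
    seen, stack = {0}, [0]
    while stack:
        for nxt in adj[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return len(seen) == n_nodes

members = [(0, 1), (1, 2), (2, 3), (0, 2)]   # one member is non-redundant
fragile = [k for k in range(len(members))
           if not still_connected(4, members, k)]
print("members whose loss disconnects the structure:", fragile)   # [2]
```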

To be continued ...
