New Delhi: Materials are the face of our civilization, and techno-social growth hinges on ground-breaking discoveries in materials and manufacturing methods. We and our surroundings are all made of materials, and next-generation technology drives a new generation of materials and manufacturing.
Thanks to advanced computational methods in engineering, sophisticated synthesis and faster testing methods, the development cycle of a new material has come down from a decade to a couple of years. However, we have not yet reached the point where we can design a material digitally and realize it in the lab or in an industrial setting without extensive trials.
New paradigm in material design
Designing a new material goes well beyond the simple idea of mixing its constituents. For example, the most widely used engineering alloys, e.g. steels, superalloys and titanium alloys, have many constituent elements such as iron, carbon, nickel, chromium, titanium, vanadium and so on. One needs to understand the temperatures at which these elements melt and mix, how fast to cool the liquid, how to form it into various usable shapes, the intermediate heating steps, and so on. Iterative design of new alloys by working through the permutations and combinations of composition and process variables falls within the scope of metallurgy.

As is evident from the above, the conventional alloy design process is multistage and thus multivariable, demanding tremendous effort. Further, alloy design is also about meeting a well-defined target of properties such as yield strength, ultimate tensile strength, fracture toughness and fatigue life. This is possible only when a fine selection of constituents is combined with the right sequence of processes. It is not very different from making your favourite dish. And, last but not least, the dish must be served for a specific meal under specific conditions; the latter is akin to using the material for the correct application.

A relatively new paradigm in material design, which makes things easier, is the application of machine learning (ML) and artificial intelligence (AI). These concepts are pushing the boundaries using the newest families of algorithms, e.g. generative AI for process and material design, learning about new materials by harnessing the knowledge hidden in data. Such algorithms are not limited to finding relations among composition, process, property and performance; they can also suggest the next experiments, or modifications in composition and process, that lead to the next generation of materials.
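To make the "suggest the next experiment" idea concrete, here is a minimal active-learning sketch in Python. The compositions, property values and the choice of a Gaussian-process surrogate are illustrative assumptions, not data or methods from any specific study.

```python
# A minimal active-learning sketch: a surrogate model trained on
# composition-process-property data proposes the next experiment.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Hypothetical training data: each row is (wt% Cr, wt% Ni, ageing temp in C);
# y is a measured property, e.g. yield strength in MPa.
X_train = np.array([[18.0, 8.0, 480.0],
                    [17.0, 12.0, 500.0],
                    [16.0, 10.0, 520.0]])
y_train = np.array([520.0, 560.0, 545.0])

# Gaussian-process surrogate: predicts the property with an uncertainty.
model = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)

# Candidate compositions/processes not yet tried in the lab.
candidates = np.array([[15.0, 9.0, 510.0],
                       [19.0, 11.0, 490.0],
                       [17.5, 10.5, 530.0]])

mean, std = model.predict(candidates, return_std=True)

# Upper-confidence-bound acquisition: balance high predicted strength
# (exploitation) against high model uncertainty (exploration).
ucb = mean + 1.96 * std
next_experiment = candidates[np.argmax(ucb)]
print("Suggested next experiment (Cr, Ni, ageing T):", next_experiment)
```

Each lab result fed back into the training set sharpens the surrogate, so the loop converges on promising compositions far faster than exhaustive trials.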
The overhead of developing new materials is significant, and the ML approach can reduce the cost by large margins. Data-driven ML models predict well within the range of data on which they are trained. They are also very fast, since they extract information based on a ranking of features, i.e. the variables that influence the outcome most. A model built on a few carefully selected features can approximate the truth reasonably well while remaining computationally cheap. Another important factor in cost reduction is on-the-go data collection from experiments and manufacturing.
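As an illustration of such feature ranking, the sketch below trains a random-forest model on synthetic alloy data and orders the inputs by importance; the feature names and data are hypothetical placeholders.

```python
# A minimal feature-ranking sketch, assuming a tabular dataset of alloy
# descriptors; feature names and data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
features = ["wt%_C", "wt%_Cr", "quench_rate", "ageing_time"]

# Synthetic data standing in for measured composition/process records;
# by construction, the target depends mainly on the first two features.
X = rng.uniform(0.0, 1.0, size=(200, len(features)))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based importance; a reduced model can then
# be trained on only the top-ranked features.
ranking = sorted(zip(features, model.feature_importances_),
                 key=lambda p: p[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```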
Several studies have established that, for science, one needs reliable data rather than simply a very large amount of data. For timely feedback and a decent quantity of data, high-fidelity, high-throughput experiments have proven very useful. Small-scale mechanical testing at faster rates is now possible thanks to advances in automation, new testing concepts applicable to small specimens, modelling and the application of data science.
Small-scale testing is particularly useful for expensive processes, e.g. additive manufacturing, multistage metal forming and layer-by-layer manufacturing at various length scales. The latter two are of particular interest because of the non-uniform properties of the final product. In some cases, even conventional processes produce non-uniform properties, because the design demands them. Compositionally graded materials, in which the composition varies along given dimensions of the product or component, are good examples.
Figure: Materials innovation takes centre stage in next-generation technology. Current material, process and performance data are important for continuous improvement and for innovations in manufacturing.
Predicting next processes
Similarly, high-throughput characterization of materials is critical. The scanning electron microscope (SEM) has been very useful here, since it can gather meaningful data at the mesoscale (microns) in a short time, thanks to automation and ML-assisted analysis algorithms.
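As a simple illustration of automated micrograph analysis, the sketch below thresholds and measures particles in a synthetic image using classical image processing; it stands in for the ML-assisted pipelines mentioned above, and the image is artificial, not a real SEM frame.

```python
# A minimal automated-micrograph-analysis sketch on a synthetic
# grayscale image standing in for real SEM data.
import numpy as np
from skimage import filters, measure

# Synthetic "micrograph": bright circular particles on a dark background.
img = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
for cy, cx in [(30, 40), (80, 90), (100, 30)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 100] = 1.0
img += 0.05 * np.random.default_rng(0).normal(size=img.shape)

# Automated pipeline: threshold, label connected regions, measure sizes.
mask = img > filters.threshold_otsu(img)
labels = measure.label(mask)
for region in measure.regionprops(labels):
    print(f"Particle {region.label}: area = {region.area} px")
```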
ML methods can replace the long, expensive trial-and-error methodology of material design with smart discovery and deployment. A tangible effort is under way across the globe to curate, capture, generate and manage data, and to develop algorithms that predict the next processes and materials.
The materials and manufacturing community continues to witness remarkable progress in the synthesis and characterization of multicomponent materials, e.g. high-entropy alloys (HEAs). Such materials span a wide spectrum of compositions, phases and properties, yet only a very sparse dataset is currently available for them.
In spite of these advances, there is a dire need for new materials to meet the growing demands of emerging technology under stringent regulations on carbon footprint, energy efficiency and environmental conservation. This demand still outpaces the evolving gamut of data science, machine learning, computer vision and generative AI. The community strives to capture, holistically, the full property hull of a given class of materials in order to address the target performance.
There is no doubt that AI/ML-assisted modelling is promising. Challenges remain in collecting reliable data, error and uncertainty quantification, interpretable machine learning, and the applicability of ML models, i.e. the transferability of knowledge from one material system to another. A similar challenge remains in transferring small-scale test data to the component scale. A model is truly useful when it can predict in real time, and that part of the technology is still in its infancy. This points to the need for physics-based models that can fill in sparse datasets or help build hybrid models that are more interpretable.
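One way to read "physics-based models that fill in sparse datasets" is a hybrid residual model: a physical law provides the baseline and ML learns only the correction. The sketch below uses the Hall-Petch relation for yield strength versus grain size as the baseline; all constants and data points are illustrative assumptions, not measured values.

```python
# A minimal hybrid-model sketch: a physics baseline (Hall-Petch) is
# corrected by an ML model trained on the residuals.
import numpy as np
from sklearn.linear_model import Ridge

def hall_petch(d_um, sigma0=120.0, k=600.0):
    """Physics baseline: yield strength (MPa) from grain size d (microns)."""
    return sigma0 + k / np.sqrt(d_um)

# Sparse "measured" data: grain size (microns) and yield strength (MPa).
d = np.array([5.0, 10.0, 20.0, 40.0])
sigma_measured = np.array([400.0, 315.0, 260.0, 220.0])

# Train the ML part only on what the physics misses (the residual).
residual = sigma_measured - hall_petch(d)
corrector = Ridge(alpha=1.0).fit(d.reshape(-1, 1), residual)

# Hybrid prediction at an unseen grain size: physics + learned correction.
d_new = np.array([[15.0]])
prediction = hall_petch(d_new.ravel()) + corrector.predict(d_new)
print(f"Hybrid yield-strength estimate at 15 um: {prediction[0]:.1f} MPa")
```

Because the physics carries most of the trend, the ML component needs far fewer data points, which is precisely the advantage when datasets are sparse.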
(Disclaimer: Prof. Alankar is Associate Professor, ICME and Materials Genome (ImaGen) Lab, Department of Mechanical Engineering, IIT Bombay. His team has been working on innovations in accelerated material design and digital manufacturing. Views in this article are personal.)