
Advanced Composites Pilot for the Materials Genome Initiative

Summary

The development of new materials for diverse applications, from new energy sources to more efficient homes, autos, and aircraft to environmental remediation and recycling, requires a more efficient means of designing material properties from theory, rather than the traditional Edisonian trial-and-error approach or reliance on materials at hand regardless of whether they are the best choice for their intended use. Many of these applications call for lightweight, flexible, and strong materials, a combination of properties that inevitably leads to a consideration of polymer composite materials. Manufacturing these materials is also a problem when the design space is so large: manufacturing process design and material design need to be done simultaneously, which is not the general practice at present.[1]

Polymer composites in general include three generic kinds of inclusions in a polymer matrix: fibers (particles extended in one dimension and compact in two dimensions), plates (particles extended in two dimensions and compact in one dimension) and fillers (particles that are compact in all three dimensions). Each kind of inclusion can have a different size and shape distribution, can be made from more than one kind of material, can be present in a variety of volume fractions (independent for each type), and all three kinds of inclusions can be simultaneously present. If there is a large size difference between two types of inclusions, the small inclusions can be thought of as changing the properties of the matrix in which the large inclusions are embedded. More generally, all the inclusions need to be simultaneously considered. The inclusion-matrix and inclusion-inclusion interfaces can play important roles in composite properties. This wide variety of inclusions and composite scenarios means that there is an enormous variety of advanced polymer composites that can be designed, forcing a more model-based means of design in place of a trial-and-error approach that is becoming increasingly unwieldy as material choices increase. 

Many industries are vigorously attempting to implement multi-scale composite modeling to design their materials and thus go beyond a trial-and-error approach. A survey of the most important users shows a deliberate reliance on commercial software, which tells us that NIST should in general not try to develop new general-purpose models for industry to use, except where there is a gap in commercial software that we can fill with our own (e.g., suspension modeling). Therefore, we believe that we can best serve the MGI initiative by creating computational tools and standard model examples that make the use of commercial software more realistic, flexible, and reliably accurate. The tools for calculating essential auxiliary material properties of polymer composite materials will be placed in the hands of engineers involved in materials development. These NIST-validated tools and standard model examples will give confidence to industrial model users and will make their model-based development of new materials faster and more effective.

Description

Many computational programs are used by industry to predict the properties of polymer composites, but their predictive power is limited by how faithfully the microstructure is represented and how realistically the interfaces are handled. The computations are constrained by overly simple descriptions of additive particle shape and dispersion. These models are also limited in their treatment of interparticle interactions and of the intricate intertwining of particle properties, particle shape, and the changes in the polymer matrix that occur because of the added particles.

Our general approach directly addresses these issues and is divided into four technical thrusts: molecular modeling of polymer and filled polymer systems; particle shape, intrinsic properties, and interparticle interaction; dispersion rheology; and macroscopic properties of particle-reinforced composites. The ultimate deliverable of this program is a toolbox, which will include these basic elements: finite element methods adapted to realistic shapes, path-integration methods (ZENO) for more complex particle shapes, and measurements and molecular dynamics simulations that give insight into nanoparticle perturbation of the polymer matrix and its influence on dispersion. The NIST MGI toolbox for advanced polymer composites will offer an improved scheme for modeling particle additives more realistically. This capability will aid the particle characterization process and enable the calculation of effective properties using the finite-element, path-integral, and other methods in the toolbox. The toolbox will also provide a framework for modeling dispersion factors and using them to optimize dispersion, and thereby the resulting property changes.

I: Molecular Modeling of Polymer and Filled Polymer Systems

Summary

Molecular issues related to interface and dispersion in polymer composites will be addressed through the development of a coarse-grained force field for polymer/nanofiller systems and an associated toolbox for calculating composite properties based on molecular potentials.

Background

The properties of polymer composite materials are highly dependent on molecular-scale interactions between polymer and filler. The issue of fundamental importance is the polymer/filler interface. The presence of filler modifies the matrix properties, and the interface is defined as the region in the vicinity of the filler where the matrix properties differ from the bulk. The ability to tailor the polymer/filler interaction so as to optimize the impact of the interface on macroscopic properties is the key to the intelligent design of composites.

A second fundamental issue is filler dispersion. Mechanical properties are enhanced and optimized when dispersion is uniform and stable and filler aggregates are minimized. At a molecular level, filler dispersion becomes an issue because strong van der Waals forces between the particles tend to produce high levels of aggregation in the absence of strategies to modulate these interactions. A common and widely adopted strategy for controlling interparticle interactions is to graft short molecules or polymer chains to the filler surface. However, this approach also affects the interface, coupling the two problems. Finally, the polymer matrix itself also plays a role in dispersion (regardless of particle interaction properties), as dispersion is difficult to achieve in systems where the particle size is greater than the polymer radius of gyration.

It is clear that the issues of interface and dispersion are fundamentally related, controlled by enthalpic and entropic interactions between filler and matrix that occur at the molecular level. The need for molecular-level models that enable fast and accurate simulation of such systems is therefore clear. At the same time, interface properties and dispersion are the two key variables needed for multiscale modeling of polymer composites: continuum-level models that predict macroscopic properties rely on knowledge of the size of the interfacial region, the polymer properties in that region, and the distribution and properties of filler within the matrix. Therefore, the challenge is not only to model these quantities at the molecular level, but to convert the outputs into a form suitable for multiscale linking. Computational tools and theories that accomplish this are therefore an indispensable part of this effort.

Strategy and Work Plan

Molecular dynamics is a powerful tool for studying the effects of molecular level dispersion and interface properties in polymer composites. Two computational tools can be identified as fundamentally essential for enabling more accurate modeling in this area:

  1. A standardized, validated coarse-grained force field that accurately reflects the specific chemical properties of constituent polymers and particles, while maintaining accurate melt dynamics;
  2. A verified toolbox for characterizing dispersion and interface properties based on molecular potentials that can be linked to macroscopic property models.

Force Field Development 

The need for a force field arises from several factors. First, the size and length scales of the constituent species necessitate coarse-grained rather than atomistic models, because the system size is overwhelming when described in atomistic terms. However, standard, verified coarse-grained force fields for polymeric materials are not available in the same sense as they exist for atomistic MD calculations. This leads either to the use of generic models that are not quantitative, or to models developed ad hoc for specific systems, with very little general applicability and little to no dissemination. The challenge in this area is to develop a standardized and verified coarse-grained force field that accurately reflects the specific chemical properties of constituent polymers and particles, while maintaining accurate melt dynamics. This is especially important for composite systems, where the specific nature of the polymer/filler interactions plays a fundamental role in the properties of the interface.

The Kremer-Grest (K-G) model [refs. 1986, 1990] is the most widely used coarse-grained MD model for polymer melts and provides an excellent description of melt dynamics for flexible polymer chains both below (Rouse dynamics) and above (reptation dynamics) the entanglement molecular weight (where MW is expressed in the number of beads). It is desirable to develop an extended K-G type model that accounts for differing levels of chain stiffness, chemical functionality, or morphological features such as tacticity, which would make the coarse-grained model a meaningful representative of a specific polymeric species while preserving appropriate melt dynamics.
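
To make the starting point concrete, the sketch below (in reduced Lennard-Jones units) evaluates the two potentials that define the standard K-G bead-spring model, a FENE spring plus a purely repulsive WCA pair interaction, using the commonly published parameter values k = 30 ε/σ² and R0 = 1.5 σ; any stiffness or functionality extensions would be added on top of this base.

```python
import numpy as np

EPS, SIGMA = 1.0, 1.0           # LJ energy and length scales (reduced units)
K_FENE, R0 = 30.0, 1.5          # FENE spring constant (eps/sigma^2), max extension
R_WCA = 2.0 ** (1.0 / 6.0)      # WCA cutoff: LJ truncated at its minimum

def u_wca(r):
    """Purely repulsive (truncated-and-shifted) LJ pair energy."""
    if r >= R_WCA:
        return 0.0
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPS * (sr6 * sr6 - sr6) + EPS

def u_fene(r):
    """FENE bond energy between adjacent beads; diverges as r -> R0,
    which (together with WCA) prevents chains from crossing."""
    if r >= R0:
        return np.inf
    return -0.5 * K_FENE * R0**2 * np.log(1.0 - (r / R0) ** 2)

def u_bond(r):
    """Total bonded interaction: FENE spring plus WCA excluded volume."""
    return u_fene(r) + u_wca(r)

# The minimum of u_bond sets the equilibrium bond length (~0.97 sigma).
r = np.linspace(0.8, 1.2, 401)
print("equilibrium bond length ~", r[np.argmin([u_bond(x) for x in r])])
```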

Strategies for accurately modeling polymeric systems with coarse-grained MD can be garnered by examining existing coarse-grained models used for phase equilibria and colloidal systems. In such models, coarse-grained particles represent different types of chemical moieties and map these atomistic entities at up to a 4:1 basis (not counting hydrogens). In polymeric systems, significantly higher levels of coarse-graining will be needed than in colloidal systems, ranging from 10:1 to 100:1 (in terms of repeat units) depending on molecular weight. Such "super" coarse-graining is probably most easily achieved through a progressive scheme in which, starting from an atomistic description, species are systematically coarse-grained at 2:1 ratios. Parameters will be developed and stored at each level, enabling modelers to choose finer or coarser levels depending on the particular application.
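
As an illustration of the geometric half of such a progressive scheme, the minimal sketch below repeatedly merges consecutive bead pairs at their centers of mass; the hard part, deriving matched potentials at each level (e.g., by iterative Boltzmann inversion), is deliberately not shown, and the chain here is a synthetic random walk rather than a real atomistic trajectory.

```python
import numpy as np

def coarsen_once(positions, masses):
    """One 2:1 pass: merge consecutive bead pairs at their centers of mass."""
    n = (len(positions) // 2) * 2              # drop a trailing odd bead, if any
    p, m = positions[:n], masses[:n]
    m_pair = m[0::2] + m[1::2]
    com = (p[0::2] * m[0::2, None] + p[1::2] * m[1::2, None]) / m_pair[:, None]
    return com, m_pair

def coarsen(positions, masses, levels):
    """Apply the 2:1 mapping repeatedly; level k gives a 2**k : 1 description."""
    for _ in range(levels):
        positions, masses = coarsen_once(positions, masses)
    return positions, masses

# Synthetic 1024-bead random-walk chain reduced to a 16:1 description.
chain = np.cumsum(np.random.randn(1024, 3), axis=0)
pos, m = coarsen(chain, np.ones(1024), levels=4)
print(pos.shape)                               # (64, 3): 1024 beads -> 64 super-beads
```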

In addition, it is desirable to adopt from the colloidal work the idea of standard particle types, which would make extension to different polymer types more efficient. For example, in certain colloidal models, coarse-grained particles representing atomistic chemical moieties are classified into four basic types: polar (hydrophilic), apolar (hydrophobic), neutral, and charged. These types are further sub-classified and parameterized in terms of Lennard-Jones pair potentials that represent interaction strengths ranging from "supra attractive" to "repulsive". Based on these concepts, a strategy can be developed for constructing a modified Kremer-Grest type model that incorporates details about the nature of monomer-monomer and filler-monomer interactions. The force field will be written in Chemical Markup Language (CML), which will allow it to be used in databases and applications that build ensembles for molecular modeling.
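
As a sketch of what a machine-readable bead library might look like, the snippet below emits a CML-flavored XML fragment using only Python's standard library; the element and attribute names (atomType, ff:epsilon, etc.) are illustrative assumptions, not a validated CML force-field schema.

```python
import xml.etree.ElementTree as ET

def bead_type(parent, name, cls, epsilon, sigma):
    """Append one coarse-grained bead type with its LJ pair parameters.
    Element/attribute names are a hypothetical convention, not a CML standard."""
    el = ET.SubElement(parent, "atomType", id=name)
    ET.SubElement(el, "scalar", dictRef="ff:class").text = cls
    ET.SubElement(el, "scalar", dictRef="ff:epsilon", units="kJ/mol").text = str(epsilon)
    ET.SubElement(el, "scalar", dictRef="ff:sigma", units="nm").text = str(sigma)

root = ET.Element("forceField", title="coarse-grained bead library")
bead_type(root, "P1", "polar", 5.0, 0.47)
bead_type(root, "C1", "apolar", 3.5, 0.47)
print(ET.tostring(root, encoding="unicode"))
```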

Property Toolbox  

Another fundamental need in this area is the development of a verified property-calculation toolbox to characterize interface and dispersion properties, one that can be used as a basis for multiscale linking with continuum models. A number of properties can quickly be identified as essential for such a toolbox:

Property                 Method/Technique             Category              Multiscale Link
Tg                       NPT density vs. T sweep      Interface/Bulk        Mechanical properties
Modulus                  Various                      Interface/Bulk        Mechanical properties
Viscosity                Green-Kubo                   Melt                  CFD/processing
Linear viscoelasticity   GSER/microrheology           Interface/Bulk/Melt   Multiple
Aggregation              Second virial coefficient    Dispersion            Mechanical properties
Free energy/chi          Widom insertion              Thermodynamics        CFD/processing
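
As a concrete example of one toolbox entry, the sketch below implements the Green-Kubo estimate of melt viscosity, η = (V/kT) ∫ ⟨σxy(0)σxy(t)⟩ dt, from a time series of the off-diagonal stress; the exponentially correlated signal used here is synthetic stand-in data, whereas in practice σxy would come from an equilibrium MD trajectory.

```python
import numpy as np

def green_kubo_viscosity(sigma_xy, volume, kT, dt, t_max):
    """Shear viscosity from the stress autocorrelation function:
    eta = (V / kT) * integral_0^t_max <sigma_xy(0) sigma_xy(t)> dt."""
    n = len(sigma_xy)
    lags = int(t_max / dt)
    acf = np.array([np.mean(sigma_xy[: n - k] * sigma_xy[k:]) for k in range(lags)])
    return volume / kT * np.trapz(acf, dx=dt)

# Synthetic stand-in signal: exponentially correlated noise. In practice
# sigma_xy(t) is the off-diagonal pressure-tensor component from MD.
rng = np.random.default_rng(0)
s = np.zeros(100_000)
for i in range(1, len(s)):
    s[i] = 0.99 * s[i - 1] + 0.1 * rng.standard_normal()
print(green_kubo_viscosity(s, volume=1000.0, kT=1.0, dt=0.005, t_max=2.0))
```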

Part of the work in this area will be to survey the literature to determine a complete list of essential properties and the most accurate and efficient methods for each. Code will be written where necessary, in a language (or languages) chosen to maximize dissemination.

II: Particle Shape, Intrinsic Properties and Interparticle Interaction

Summary

We will create procedures for acquiring realistic information about particle shape and for performing calculations on the real rather than assumed shapes of the additive particles. This will require an integration of experimental and computational methods to acquire this information, model the properties of the particle dispersion, and validate the simulation against input observations. The entire process, from accepting shape information in a variety of formats to computing the effects of particle shape, will be integrated into a software package that industry can use to make its use of commercial software more realistic and effective.

Background

A composite material consists of a matrix and particle inclusions. When there is only a small volume fraction of particles, called the dilute limit, a basic composite property P can be written as a virial expansion, P = P₀[1 + [P]φ + O(φ²)], where φ is the particle volume fraction and P₀ is the matrix value. P can be the shear viscosity η, the conductivity σ, or the shear modulus G or bulk modulus K of the composite, and the relevant virials are the intrinsic viscosity [η], the intrinsic conductivity [σ]₀ of insulating particles in a relatively conductive matrix, the intrinsic conductivity [σ]∞ of highly conducting particles in a relatively insulating matrix, the intrinsic shear modulus [G], the nanoparticle diffusion coefficient [D], the intrinsic bulk modulus [K], the intrinsic Poisson ratio [ν], the intrinsic dielectric constant [ε], the intrinsic refractive index [n], the intrinsic magnetic permeability, the intrinsic hydrodynamic effective mass [M], and others. These virial coefficients are functions of particle shape, the particle-matrix interface, and the property contrast between particle and matrix, defined by the property ratio Δp = Pparticle / Pmatrix.
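
A worked example of the expansion: for rigid spheres, the intrinsic viscosity takes Einstein's value [η] = 2.5, so the first-order estimate below shows how quickly even a dilute filler loading shifts the suspension viscosity (non-spherical particles have larger virials that rise steeply with aspect ratio).

```python
def dilute_property(P0, intrinsic, phi):
    """First-order dilute-limit estimate P = P0 * (1 + [P] * phi)."""
    return P0 * (1.0 + intrinsic * phi)

ETA_INTRINSIC_SPHERE = 2.5          # Einstein's intrinsic viscosity for spheres
eta_matrix = 1.0                    # matrix viscosity (arbitrary units)
for phi in (0.01, 0.02, 0.05):
    print(phi, dilute_property(eta_matrix, ETA_INTRINSIC_SPHERE, phi))
# Even at phi = 0.05 the suspension viscosity is ~12.5 % above the matrix value.
```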

Current computations of these important intrinsic properties are limited to simple shapes like spheres and ellipsoids. But theoretical modeling of the properties of composites can only be as good as the modeling of the structure and properties of the particles added to the polymer medium, along with the complex fluid properties of the polymer medium itself. This type of characterization is extremely computationally intensive, which explains why it is so often neglected. This effort will directly address the problem through advanced computational techniques such as path integration and massively parallel finite element methods, which enable modeling of the properties of realistic particles and polymers. The overall goal is to provide this powerful tool to industrial scientists as a direct input into the commercial software they use, for the facile calculation of properties of interest and the development of new materials.

Strategy and Work Plan

We will first focus on the calculation of basic properties of advanced composites and the general problem of how these properties depend on filler shape in relation to the surrounding matrix in the dilute particle limit, where dispersion is not important. The resulting property changes provide a useful metrology for particle shape, and for the intrinsic properties of the particles themselves if the particle shape is known from independent imaging observations. The metrologies revolve around studying, through measurement and simulation, how the particles alter the properties of the polymer material in the low-concentration limit. Realistic models in the concentrated particle regime depend on knowing the dilute limit accurately, and the dilute limit plays an essential role in characterizing the particles and their mutual interactions.

Mathematically, these property calculations are 'classical' in that they involve the solution of well-known equations of continuum mechanics (Laplace, Poisson, Navier-Stokes, Kelvin, Maxwell), but the treatment of complex particle shape and boundary data is intractable analytically and challenging even for existing computational resources. The basis of our approach to this general problem is the use of efficient finite element and boundary element codes adapted to treat boundary data (some of which are available commercially), along with newly developed path-integral algorithms, involving averaging over random-walk paths, that allow us to treat this type of problem. We will develop a general computational package that calculates the intrinsic property changes associated with adding particles to polymer matrices, although molecular dynamics simulations may be required to treat the effects of glass-forming and entangled polymer matrices.

The path-integration method for computing composite properties (called ZENO) involves enclosing an arbitrarily shaped probed object within a sphere and launching random walks from this sphere. Each probing trajectory either hits the object or returns to the launch surface ('loss'), as shown in the figure for a model soot particle aggregate, whereupon the trajectory is terminated or reinitiated. ZENO permits great flexibility in defining particle geometry (e.g., beads, cylinders, ellipsoids, and triangulated surfaces), allowing more physically realistic modeling of particle structure. ZENO is computationally faster than competing methods for complex geometries and parallelizes completely.

Most methods have computational times of O(n³), where n is the number of body elements, but ZENO computation times are O(n). This is a serious factor for complex bodies, where n is large, and for random objects, where ensembles of objects must be generated and sampled. Finite element and finite difference techniques can be used effectively for a larger range of intrinsic particle properties but a smaller range of shapes. However, random star-shaped particles, which cover a very wide range of industrial fillers, can be handled with spherical harmonic techniques that are adaptable to finite element methods, typically on small to medium-size parallel computers.
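
The sketch below illustrates the hit-or-miss random-walk idea in a self-contained way for the one body whose answer is known exactly, a sphere of radius a (capacitance C = a); the uniform re-launch step is a simplification of the exact return distribution ZENO uses, and real applications replace the analytic distance function with one for an arbitrary triangulated body.

```python
import numpy as np

rng = np.random.default_rng(1)

def unit_vector():
    """Uniformly distributed direction on the unit sphere."""
    v = rng.standard_normal(3)
    return v / np.linalg.norm(v)

def dist_to_body(x, a=1.0):
    """Distance from x to the probed object. Here the object is a sphere
    of radius a, chosen because its capacitance (C = a) is known exactly
    and can validate the estimate; a triangulated body would replace this."""
    return np.linalg.norm(x) - a

def walk(b=2.0, eps=1e-4):
    """One probing trajectory launched from the enclosing sphere of radius b.
    Returns True on a hit, False on a loss (escape to infinity)."""
    x = b * unit_vector()
    while True:
        d = dist_to_body(x)
        if d < eps:
            return True                    # hit: trajectory terminated
        x = x + d * unit_vector()          # walk-on-spheres step
        r = np.linalg.norm(x)
        if r > b:                          # wandered outside the launch sphere
            if rng.random() > b / r:       # escape probability 1 - b/r in 3D
                return False
            x = b * unit_vector()          # simplified re-launch; ZENO uses
                                           # the exact return distribution

n = 20_000
p_hit = sum(walk() for _ in range(n)) / n
print("capacitance estimate:", 2.0 * p_hit, "(exact: 1.0)")   # C = b * p_hit
```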

The software package that will be assembled in this part of the Materials Genome Initiative program will be used in multiple modalities in materials design:

  1. The program will guide the choice of particle shape and of the materials from which the particles are composed so that the composite's properties fall in a target range for a given application. This design often involves an optimization problem, because a beneficial change in one property can lead to a detrimental change in another. Users of the program can explore the impact of the factors they control in developing their product.
  2. One of the greatest problems in modeling polymer composites containing nanoparticles is that the small size of the particles can change their properties relative to the bulk material, and it is rather inconvenient to determine these modified particle properties. If the shape of the nanoparticle is independently determined, e.g., by TEM, then the property of the particle itself can be determined from measurements of the intrinsic properties of the suspension (a closed-form example for spheres follows this list).
  3. Intrinsic properties are potentially very useful for characterizing the mean dimensions and average shape of nanoparticles, and indeed of essentially any object. Variational principles associated with these fundamental properties establish a systematic shape-classification metrology. To be specific, it should be readily possible to determine the length of particles like carbon nanotubes from intrinsic viscosity or intrinsic conductivity measurements of suspensions of these particles. This project will help establish a routine measurement-based metrology for nanoparticle shape, with a mathematical underpinning that allows this type of characterization to be made more easily than at present, where very crude models are the norm.
  4. Intrinsic property measurements are fundamental for characterizing how nanoparticles perturb the local properties of the polymer matrix to which they are added. Molecular dynamics simulations are absolutely necessary for understanding these effects, which relate to changes in the interfacial boundary conditions around the particles. We plan to introduce models of composites that approximately incorporate the interfacial zone's property changes and spatial extent, with this information supplied from experiment and molecular dynamics simulation. This will provide a tool for assessing and understanding property changes in composites where the nanoparticles strongly perturb the matrix.
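
As a closed-form illustration of modality 2 above: for spheres, the dilute-limit (Maxwell) relation between intrinsic conductivity and the particle/matrix contrast D = σparticle/σmatrix inverts exactly, so a measured intrinsic value plus known shape yields the particle's own conductivity. The measured value below is hypothetical, and non-spherical shapes require the numerical forward models (ZENO, finite elements) described above together with a numerical inversion.

```python
def intrinsic_conductivity_sphere(contrast):
    """Forward (Maxwell) model for dilute spheres:
    [sigma] = 3 * (D - 1) / (D + 2), with D = sigma_particle / sigma_matrix."""
    return 3.0 * (contrast - 1.0) / (contrast + 2.0)

def contrast_from_intrinsic(intrinsic):
    """Closed-form inverse: D = (3 + 2*[sigma]) / (3 - [sigma])."""
    return (3.0 + 2.0 * intrinsic) / (3.0 - intrinsic)

measured = 1.2                                  # hypothetical measured [sigma]
D = contrast_from_intrinsic(measured)
print(D)                                        # -> 3.0: particle is 3x the matrix
print(intrinsic_conductivity_sphere(D))         # round-trip check -> 1.2
```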

Although the preliminary ZENO program at NIST offers many of the required properties for this proposed software package, there are still many gaps that limit its effective application to problems of composite design. The problem of calculating properties with certain boundary conditions (e.g., insulating particles in conductivity calculations) to high accuracy has not yet been solved; this is a non-trivial problem requiring a postdoc with advanced mathematical training in probabilistic potential theory and computational mathematics. It is also important to develop general-purpose algorithms for treating the osmotic virial coefficients of nanoparticles of general shape, to establish a metrology for nanoparticle interaction. Some programs exist for this purpose for simple molecular fluids, and significant theoretical and software development will be required to turn these into tools that process engineers can use in their work. It will be necessary to create an interface program, similar in general character to OOF or the Virtual Cement and Concrete Testing Laboratory (VCCTL), that allows facile exploration of theoretical structure-property relations and provides adequate visualization capabilities to aid the industrial researcher engaged in material design.

It is important that information about particle shape can be fed into the program from a variety of sources, whether direct observation or simulation. Standardized formats will be required, integrating various particle-representation models to give the user many options for introducing this information into the software. We anticipate significant help from ITL in creating standard objects for particle description. The existing path-integration capability of ZENO will be integrated with existing finite and boundary element programs for the required properties, depending on the computational time and accuracy requirements of the calculations. Finally, a vital part of this research program is validation of the computational tool against both molecular dynamics simulation and measurements on model fluids. The experimental and MD simulation components of this program are thus essential to its success.

This project would optimally require at least 3 or 4 postdocs over the next few years to develop it into a form useful to industry for material design. The initial phase of this effort will focus on adapting the ZENO program and its interface to the needs of the composites community. In this initial phase, effort will also be made to take stock of the properties that need to be added to ZENO for the polymer processing community, improve the ease of use of the program, and establish a database of calculations that could serve as a library for industrial researchers. Currently, the library of structures treated includes most structures in the Protein Data Bank, but work is needed to initiate databases for other structures such as carbon nanotubes, graphene, and especially polymer-functionalized nanoparticles of industrial interest, such as quantum dots and gold nanoparticles, for which no other accurate method of computationally characterizing transport properties currently exists. This effort will set the stage for extensions of ZENO that expand the applicability of the method to composite properties.

III: Dispersion Rheology

Summary

The goal of this project is to establish an interactive database of the rheological properties of polymer composites, with interactions among inclusions, inclusion particle shape, inclusion surface modification, and matrix fluid properties as inputs. The database will serve as a tool to aid industry in using commercial software to design new materials, improve quality control in manufacturing, and help reduce the time from initial idea and design to production of goods. This database will be tied into the tools being developed in other parts of this project for analyzing the effects of particle shape on composite properties. Rheology is one more such property, albeit one needed at the composite processing step.

Background

The flow of polymer composites via injection molding, spraying, or spreading on surfaces is crucial to a wide variety of industrial applications. For single-phase liquids, there is already a very wide range of possible flow properties, depending on molecular interactions, temperature, and other variables. For polymer composites, adding particles of various shapes, with varying (and tunable) interactions with each other and with the matrix fluid, makes the rheological problem even more complex. To optimize flow conditions and improve quality control in manufacturing, it is important to predict the flow of the polymer composite in complex geometries, which requires the material's rheological properties as input into the commercial CFD-based models used by industry. However, it remains a great challenge to predict the rheological properties of complex fluids like polymer composites, as they are highly dependent on the nature of the matrix fluid; on particle volume fraction, shape, and size distribution; and on particle/matrix and particle/particle interactions. Some of these difficulties are being met through sophisticated techniques such as dissipative particle dynamics and smoothed particle hydrodynamics to handle the rheology, and spherical harmonic techniques to handle a wide range of random particle shapes. Codes and techniques like these need to be used to systematically compute rheological properties for a wide range of particle shapes, size distributions, surface modifications, interactions, and matrix types. These results then need to be systematically organized and correlated so that they can be easily used in commercial software, making its predictions more realistic.

Strategy and Work Plan

The goal of this project is to establish an interactive database to provide rheological properties of polymer composites, where interparticle interaction, particle shape, surface modification and matrix fluid properties are inputs. The database will serve as a tool to aid industry in their use of commercial software to design new materials, improve quality control in manufacturing, and help reduce time from the initial idea and design to production of goods.

The strategy is to systematically use techniques like dissipative particle dynamics, smoothed particle hydrodynamics, and spherical harmonics to generate a database that captures the effects of particle size, shape, and interactions with each other and with the matrix fluid on rheological constitutive relations (e.g., viscosity vs. shear rate). The ranges of fluids and particles will cover those typically used in industry and will be informed by input from the industrial partners involved in this polymer composites Materials Genome Initiative program; this range of fluid and particle types is denoted the "canonical" set below. This is a multi-year project, with variables added continuously from year to year. However, the interactive database, with incomplete information, will be available after the first year for industrial comment.

In FY2013, we will evaluate the relative viscosity (i.e. viscosity of composite normalized by the viscosity of the matrix fluid) of suspensions composed of a canonical set of particle shapes, size distributions and volume fractions for Newtonian fluid matrices. We will also establish the database IT structure needed, with an emphasis on ease of use, coupling to existing commercial software, and expandability. This initial computational structure will be delivered to our industrial partners for evaluation, and will be built and maintained by the ITL portion of the Materials Genome Initiative at NIST. The effect of volume fraction will couple well with the technical thrust on particle shape, intrinsic properties and interparticle interaction, which focuses on the dilute particle limit and on non-rheological properties.  In FY2014, we will add the effects of particle surface modification on composites using Newtonian fluid matrices, while maintaining the variables of particle shape, size distribution, and volume fraction. In the following years, FY2015 and FY2016, we will add to the interactive database the effects of a non-Newtonian fluid matrix and a non-Newtonian viscoelastic fluid matrix.
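
To indicate the kind of constitutive information such database records would carry, the sketch below combines two standard empirical forms, the Krieger-Dougherty relation for zero-shear relative viscosity versus volume fraction and the Cross model for shear thinning; these named models are illustrative placeholders, since the actual database entries will come from the dissipative particle dynamics and smoothed particle hydrodynamics computations described above.

```python
def krieger_dougherty(phi, phi_max=0.64, intrinsic=2.5):
    """Zero-shear relative viscosity of a hard-sphere suspension:
    eta_r = (1 - phi/phi_max) ** (-intrinsic * phi_max).
    Recovers the Einstein dilute limit 1 + 2.5*phi as phi -> 0."""
    return (1.0 - phi / phi_max) ** (-intrinsic * phi_max)

def cross_model(gamma_dot, eta0, eta_inf, lam, m):
    """Cross model for shear thinning between plateaus eta0 and eta_inf."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * gamma_dot) ** m)

# One hypothetical database record: composition fixes the zero-shear plateau,
# and fitted Cross parameters give relative viscosity vs. shear rate.
phi = 0.30
eta0 = krieger_dougherty(phi)                  # relative zero-shear viscosity
for gamma_dot in (0.1, 1.0, 10.0, 100.0):
    print(gamma_dot, cross_model(gamma_dot, eta0, eta_inf=1.0, lam=1.0, m=0.8))
```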

IV: Macroscopic Properties of Particle-Reinforced Composites

Summary

Predictions for Properties and Use Limit or Lifetime of Particle-Reinforced Composites

Strategy and Work Plan  

Building Blocks: matrix (continuous phase) and particles (dispersed phase). Smaller inclusions (nanoscale particles) are treated as modifying the matrix properties, which are obtained from Technical Thrusts I-III for polymer nanocomposites.

Variables on which composite properties depend:

  • Properties of the phases
  • Amounts of the phases
  • Architecture of the dispersed phase (particle size, shape) and its distribution
  • Properties of the interphase

Composite Properties Prediction – Single-Scale Modeling

  • Using Monte Carlo methods, obtain the arrangement and distribution of particles (i.e., possible structures of the composite material)
  • Predict composite properties as a function of structure (e.g., mechanical properties, thermal and electrical conductivity, permeability, moisture absorption, recyclability, etc.) through homogenization, in which the material is idealized as effectively homogeneous within a representative volume element (RVE); the assumption of material continuity is required in order to develop structure-property relationships (an elementary bounds example follows this list)
  • Solution method: coupling of the Monte Carlo method with the phase-field method (desired) or FEM
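
The elementary bounds sketch referenced above: the Voigt (uniform strain) and Reuss (uniform stress) averages bracket the effective stiffness of any two-phase composite, and the width of that bracket for a typical filler/matrix contrast shows why microstructure-resolved RVE homogenization is needed; the moduli used are round illustrative numbers.

```python
def voigt(E_p, E_m, phi):
    """Upper (uniform-strain) bound: arithmetic mean weighted by volume fraction."""
    return phi * E_p + (1.0 - phi) * E_m

def reuss(E_p, E_m, phi):
    """Lower (uniform-stress) bound: harmonic mean weighted by volume fraction."""
    return 1.0 / (phi / E_p + (1.0 - phi) / E_m)

E_particle, E_matrix, phi = 70.0, 3.0, 0.2   # e.g., a stiff filler in a polymer (GPa)
print(reuss(E_particle, E_matrix, phi))      # ~3.7 GPa
print(voigt(E_particle, E_matrix, phi))      # ~16.4 GPa: the wide gap is why
                                             # microstructure-resolved RVE models matter
```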

Composite Use Limit or Lifetime Prediction – Multiscale Modeling (transferring information between different length scales rather than coupling different simulation techniques)

  • Predict the interphase properties between matrix and particles (e.g., strength and toughness if the mechanical use limit or lifetime is of concern) using MD simulation at the micro scale, and pass this information to modeling at the next length scale (meso scale)
  • Obtain the failure surface of the interphase under multiaxial loadings using an analytical method, then pass it to the next length scale (macro scale)
  • Combine a given composite structure with the failure surface to assess the evolution of damage during service of the composite material
  • Solution method: MD simulation; couple statistical characteristics of the composite constituents (Monte Carlo method) with classical fracture mechanics (an analytical method or FEM)

Besides particle size and distribution, the tortuosity of the particle distribution can also be used as an adjustable parameter in models of transport and electrical properties and of damage evolution in composite materials. For example, crack path tortuosity increases as the crack bypasses the particles of the dispersed phase (rigid or soft).

[Figure: property-prediction workflow. Known property values of particles, matrix, and interfaces, together with a statistical geometry distribution of particle sizes (statistical microstructure models; Monte Carlo method), feed predictions of mechanical and physical properties (stiffness, hardness, transport, damage; phase-field and burning theory), optical properties (light-cure efficiency, depth of cure; Monte Carlo method and light propagation theory), and electrical and thermal properties (conductivity, shrinkage). For a given composition, these yield statistical distributions of mechanical and physical properties, their optimization, and experimental validation.]

[1] "A Third Industrial Revolution," Special Report on Manufacturing and Innovation, The Economist, April 21, 2012.
