Saturday, September 09, 2006

We are migrating to iMechanica.Org

Friday, September 08, 2006

Making Sense of the Web's Structure

Sunday, September 03, 2006

Scientists in Britain report baldness breakthrough

Saturday, September 02, 2006

Should anyone write a Wiki textbook on Mechanics of Materials?

Here is an article on Wiki-style textbooks in general.

My question is: Can we do something to help students in developing countries get free textbooks on applied mechanics? Should NSF support such an effort?

Gene therapy breakthrough against skin cancer

Tuesday, August 29, 2006

A New Class of Composite Materials - Graphene-based Composite Materials

Professor Rodney Ruoff and colleagues at Northwestern University and Purdue University have developed a process that promises to lead to a new class of composite materials: graphene-based composites. They reported the results of their research in Nature, 442 (2006) 282-286. The team has overcome the difficulty of achieving a uniform distribution of graphene-based sheets in a polymer matrix. Such composites can be readily processed using standard industrial technologies such as moulding and hot-pressing, and the technique should be applicable to a wide variety of polymers. Graphene composites may compete with carbon-nanotube-based materials in terms of mechanical properties. This new class of composites may stimulate the applied mechanics community to study the fundamental reinforcing mechanisms of graphene sheets through both experimental and theoretical approaches.

Tony Evans Elected a Fellow of the Royal Academy of Engineering

Saturday, August 26, 2006

The drama of a mathematical proof

An article in this week’s New Yorker describes the human drama behind a proof of the Poincaré conjecture, one of the seven Millennium Problems. The article is unsparing of several mathematicians of Chinese origin.

Notes added on 27 August.

Tuesday, August 15, 2006

R.S. Rivlin is awarded the Engineering Science Medal

Young Huang is awarded the 2006 SES Young Investigator Medal (University of Illinois at Urbana-Champaign)

Professor Young Huang of the University of Illinois at Urbana-Champaign is awarded the 2006 Society of Engineering Science (SES) Young Investigator Medal for his contributions to strain-gradient plasticity theory, computational nanomechanics, and computational dislocation dynamics and plasticity.

Alan Needleman is awarded the 2006 Prager Medal (Brown University)

Professor Alan Needleman of Brown University is awarded the 2006 Prager Medal for his contributions to computational failure mechanics, dislocation dynamics, modeling and simulation of fracture and strain localization, plasticity theories, and, most recently, contact and adhesion at small scales.

Saturday, August 12, 2006

Meshfree Methods: Their origin, history, and theories

The following article was written by Dr.-Ing. Timon Rabczuk, who now teaches finite element methods and meshfree methods at the Technical University of Munich, Germany, for a sister blog --- the Meshfree Methods Blog.
I enjoyed reading the article very much and would like to share it with you.


Meshfree methods go back to the seventies. The major difference from finite element methods is that the domain of interest is discretized only with nodes, often called particles. These particles interact via meshfree shape functions in a continuum framework, much as finite elements do, although particle “connectivities” can change over the course of a simulation. This flexibility of meshfree methods has been exploited in applications with large deformations in fluid and solid mechanics, e.g. free-surface flow, metal forming, and fracture and fragmentation, to name a few. Most meshfree methods are purely Lagrangian in character, though there are a few publications on meshfree methods formulated in an Eulerian (or ALE) description, e.g. Fries 2005. The most important advantages of meshfree methods compared to finite elements are: higher-order continuous shape functions, which can be exploited e.g. for thin shells or gradient-enhanced constitutive models; higher smoothness; simpler incorporation of h- and p-adaptivity; and certain advantages in crack problems (no mesh-alignment sensitivity; some methods do not need to enforce crack-path continuity). The most important drawback of meshfree methods is probably their higher computational cost, quite apart from the instabilities that certain meshfree methods exhibit.

One of the oldest meshfree methods is Smoothed Particle Hydrodynamics (SPH), developed by Lucy and by Gingold and Monaghan in 1977. SPH was first applied in astrophysics to model phenomena such as supernovae and was later employed in fluid dynamics. In 1993, Petschek and Libersky extended SPH to solid mechanics. Early SPH formulations suffered from spurious instabilities and inconsistencies that were a hot topic of investigation, especially in the 90s. Many corrected SPH versions were developed that improved either the stability behavior of SPH or its consistency. Consistency, often referred to as completeness in a Galerkin framework, means the ability to reproduce a polynomial of a certain order exactly: a method is called n-th order consistent (or complete) if it is able to reproduce a polynomial of order n exactly. While most SPH methods are based on the strong form, a wide class of methods was developed based on the weak form.
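The consistency problem, and the simplest correction, can be seen in a few lines. The sketch below (the Gaussian kernel, node spacing, and smoothing length are illustrative choices, not taken from any particular paper) approximates the constant field u = 1 with a raw SPH sum and with a Shepard-normalized sum: the raw sum fails near the domain boundary, while normalizing by the sum of weights restores zeroth-order consistency there.

```python
import math

# Gaussian kernel, a common SPH choice (illustrative)
def kernel(r, h):
    return math.exp(-(r / h) ** 2) / (h * math.sqrt(math.pi))

# Raw SPH approximation: u(x) ~ sum_j u_j W(|x - x_j|) dx_j
def sph(x, nodes, values, h, dx):
    return sum(v * kernel(abs(x - xj), h) * dx
               for xj, v in zip(nodes, values))

# Shepard correction: dividing by the sum of weights makes the
# approximation reproduce constants exactly (zeroth-order consistency)
def shepard(x, nodes, values, h, dx):
    w = [kernel(abs(x - xj), h) * dx for xj in nodes]
    return sum(v * wj for v, wj in zip(values, w)) / sum(w)

nodes = [i * 0.05 for i in range(21)]   # particles on [0, 1]
ones = [1.0] * len(nodes)               # sample the constant field u = 1
h, dx = 0.1, 0.05

raw_mid = sph(0.5, nodes, ones, h, dx)        # interior: close to 1
raw_edge = sph(0.0, nodes, ones, h, dx)       # boundary: badly off
corr_edge = shepard(0.0, nodes, ones, h, dx)  # exactly 1 after correction
```

The boundary failure of the raw sum is exactly the kind of inconsistency the corrected SPH versions mentioned above were designed to remove.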

Based on an idea of Lancaster and Salkauskas, and probably motivated by the desire to model arbitrary crack propagation without computationally expensive remeshing, the group of Prof. Ted Belytschko developed the element-free Galerkin (EFG) method in 1994. The EFG method is based on a moving least squares (MLS) approximation and avoids the inconsistencies inherent in some SPH formulations. In 1995, the group of Prof. W.K. Liu proposed a similar method, the Reproducing Kernel Particle Method (RKPM). Though very similar to the EFG method, it originates from wavelets rather than from curve fitting. The first method to employ an extrinsic basis was the hp-cloud method of Duarte and Oden. In contrast to the EFG and RKPM methods, the hp-cloud method increases the order of consistency (or completeness) by an extrinsic basis; in other words, additional unknowns are introduced into the variational formulation to increase the order of completeness. This idea was later adopted (and modified) in the XFEM context, though there the extrinsic basis (or extrinsic enrichment) is used to describe the crack kinematics rather than to increase the order of completeness in a p-refinement sense. The group of Prof. Ivo Babuska discovered certain similarities between finite element and meshfree methods and formulated a general framework, the Partition of Unity Finite Element Method (PUFEM), which is similar to the Generalized Finite Element Method (GFEM) of Strouboulis and colleagues. Another very popular meshfree method worth mentioning is the Meshless Local Petrov-Galerkin (MLPG) method developed by the group of Prof. S.N. Atluri in 1998. The main difference of the MLPG method from all the methods mentioned above is that local weak forms are generated over overlapping sub-domains rather than a global weak form; the integration of the weak form is then carried out in these local sub-domains. In this context, Atluri introduced the notion of “truly” meshfree methods, since truly meshfree methods do not need the construction of a background mesh for integration.
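For concreteness, here is a minimal 1D MLS construction with a linear basis p(x) = [1, x]; the quartic-spline weight function, node spacing, and evaluation point are illustrative assumptions, not taken from the EFG paper. The resulting shape functions form a partition of unity and reproduce linear fields exactly, which is precisely the consistency property that distinguishes MLS-based methods from uncorrected SPH.

```python
# 1D moving least squares (MLS) shape functions, linear basis p = [1, x]
def mls_shape_functions(x, nodes, h):
    # quartic-spline weight, compactly supported on |x - x_j| < h
    def weight(s):
        return (1 - 6*s**2 + 8*s**3 - 3*s**4) if s < 1 else 0.0
    w = [weight(abs(x - xj) / h) for xj in nodes]
    # moment matrix A(x) = sum_j w_j p(x_j) p(x_j)^T  (2x2 here)
    a00 = sum(w)
    a01 = sum(wj * xj for wj, xj in zip(w, nodes))
    a11 = sum(wj * xj * xj for wj, xj in zip(w, nodes))
    det = a00 * a11 - a01 * a01
    # phi_j(x) = w_j * p(x)^T A^{-1} p(x_j), with the 2x2 inverse written out
    return [wj * ((a11 - a01 * xj) + x * (a00 * xj - a01)) / det
            for wj, xj in zip(w, nodes)]

nodes = [i * 0.1 for i in range(11)]        # particles on [0, 1]
phi = mls_shape_functions(0.33, nodes, 0.25)

pu = sum(phi)                                   # partition of unity -> 1
lin = sum(p * xj for p, xj in zip(phi, nodes))  # linear consistency -> 0.33
```

Note that phi at a node is generally not 1 there: MLS shape functions lack the Kronecker-delta property, which is the root of the boundary-condition difficulties discussed later in the article.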

The issue of integration in meshfree methods has been a topic of investigation since their early days. Methods based on a global weak form may use three different types of integration schemes: nodal integration, stress-point integration, and integration (usually Gauss quadrature) on a background mesh that need not be aligned with the particles. Nodal integration is, from the computational point of view, the easiest and cheapest way to build the discrete equations, but, similar to reduced-integration finite elements, meshfree methods based on nodal integration suffer from an instability due to rank deficiency. Adding stress points to the nodes can eliminate (or at least alleviate) this instability. The term stress-point integration comes from the fact that additional nodes are added to the particles at which only stresses are evaluated; all kinematic values are obtained from the "original" particles. The concept of stress points was first introduced in one dimension in an SPH setting by Dyka, and was extended to higher dimensions by Randles and Libersky and by the group of Prof. Belytschko. There is a subtle difference between the two: while Randles and Libersky evaluate stresses only at the stress points, Belytschko and colleagues evaluate stresses at the nodes as well. Meanwhile, many different versions of stress-point integration have been developed. The most accurate way to obtain the governing equations is Gauss quadrature. In contrast to finite elements, integration in meshfree methods is not exact: a background mesh has to be constructed, and usually a larger number of quadrature points than in finite elements is used. For example, while 4 quadrature points are typically used in linear quadrilateral finite elements, Belytschko recommends 16 quadrature points in the EFG method.
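The reason more quadrature points are needed can be illustrated with any non-polynomial integrand, since products of meshfree (rational) shape functions are not polynomials and no Gauss rule integrates them exactly. In the sketch below the rational integrand is an illustrative stand-in for a shape-function product, not an actual EFG quantity; raising the number of Gauss points reduces, but does not eliminate, the integration error.

```python
import math

# Gauss-Legendre rules on [-1, 1]: n -> (points, weights)
RULES = {
    2: ([-0.5773502691896257, 0.5773502691896257],
        [1.0, 1.0]),
    4: ([-0.8611363115940526, -0.33998104358485626,
          0.33998104358485626,  0.8611363115940526],
        [0.34785484513745385, 0.6521451548625461,
         0.6521451548625461,  0.34785484513745385]),
}

def gauss(f, n):
    pts, wts = RULES[n]
    return sum(w * f(x) for x, w in zip(pts, wts))

# Rational integrand: exact for no Gauss rule, unlike a polynomial
f = lambda x: 1.0 / (1.0 + 25.0 * x * x)
exact = 2.0 * math.atan(5.0) / 5.0      # closed-form integral over [-1, 1]

err2 = abs(gauss(f, 2) - exact)         # 2-point rule: large error
err4 = abs(gauss(f, 4) - exact)         # 4-point rule: smaller, still inexact
```

For a genuine polynomial integrand the 2-point rule would already be exact up to degree 3; it is the non-polynomial character of meshfree shape functions that forces the higher point counts quoted above.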

Another important issue regarding the stability of meshfree methods is related to the kernel function, often called the window or weighting function. The kernel function is related to the meshfree shape function in a way that depends on the method. The kernel can be expressed in terms of material coordinates or spatial coordinates; we then speak of Lagrangian or Eulerian kernels, respectively. Early meshfree methods such as SPH use an Eulerian kernel. Many meshfree methods based on Eulerian kernels exhibit a so-called tensile instability, meaning the method becomes unstable when tensile stresses occur. In a sequence of papers by Belytschko, it was shown that the tensile instability is caused by the use of an Eulerian kernel; meshfree methods based on Lagrangian kernels do not show this type of instability. Moreover, it was demonstrated that, for some given strain-softening constitutive models, methods based on Eulerian kernels were not able to detect the onset of material instability correctly, while methods using Lagrangian kernels were. This is a striking drawback of Eulerian kernels when one wishes to model fracture. However, a general stability analysis is difficult to perform and will of course also depend on the underlying constitutive model. Note also that Libersky proposed a method based on Eulerian kernels and showed stability in the tension region, though he did not consider strain-softening materials. For very large deformations, methods based on Lagrangian kernels tend to become unstable as well, since the domain of influence in the current configuration can become extremely distorted. Some recent methods for modeling fracture try to combine Lagrangian and Eulerian kernels, though certain aspects still have to be studied, e.g. what happens in the transition area, or how additional unknowns are treated (in case an enrichment is used).

In meshfree methods, we speak of approximation rather than interpolation, since the meshfree shape functions do not satisfy the Kronecker-delta property. This entails certain difficulties in imposing essential boundary conditions. Probably the simplest way to impose essential boundary conditions is boundary collocation. Other options are the penalty method, Lagrange multipliers, or Nitsche’s method. Coupling to finite elements is one more alternative that has been extensively pursued in the literature; in this case, the essential boundary conditions are imposed in the finite element domain. In the first coupling method by Belytschko, the meshfree nodes have to be located at the finite element nodes, and a blending domain is constructed such that the meshfree shape functions are zero at the finite element boundary. In this first approach, discontinuous strains were obtained at the meshfree-finite element interface. Many improvements have since been made, and methods were developed that exploit the advantages of both meshfree methods and finite elements, e.g. the Moving Particle Finite Element Method (MPFEM) by Su Hao et al. or the Reproducing Kernel Element Method (RKEM) developed by the group of Prof. W.K. Liu. Meanwhile, several textbooks on meshfree methods have been published, by W.K. Liu and S. Li, by T. Belytschko, by S.N. Atluri, and in several books by Prof. G.R. Liu.
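As a minimal illustration of the penalty option, consider a tiny two-degree-of-freedom stiffness system K u = f (the matrix and load values below are made up for illustration, not taken from the article). The row of the constrained degree of freedom is augmented by a penalty parameter beta, and the prescribed value g is enforced only approximately, improving as beta grows:

```python
# Solve a 2x2 linear system K u = f by Cramer's rule
def solve2(K, f):
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(K[1][1] * f[0] - K[0][1] * f[1]) / det,
            (K[0][0] * f[1] - K[1][0] * f[0]) / det]

K = [[2.0, -1.0], [-1.0, 2.0]]   # illustrative stiffness matrix
f = [0.0, 1.0]                   # illustrative load vector
g = 0.5                          # prescribed value for dof 0

# Penalty method: augment the constrained dof's diagonal and load
def with_penalty(beta):
    Kp = [[K[0][0] + beta, K[0][1]], [K[1][0], K[1][1]]]
    fp = [f[0] + beta * g, f[1]]
    return solve2(Kp, fp)

u_weak = with_penalty(1e1)    # small beta: constraint only loosely enforced
u_strong = with_penalty(1e8)  # large beta: u[0] matches g to high accuracy
```

The trade-off is typical of the penalty method: the constraint is never exact, and a very large beta worsens the conditioning of the system, which is one reason Lagrange multipliers and Nitsche's method remain attractive alternatives.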

Many meshfree methods were developed and applied in fracture mechanics to model arbitrary crack growth. The crack was initially modeled with the visibility criterion, i.e. the crack was considered opaque and the meshfree shape functions were cut at the crack surface. Later, the diffraction and transparency methods were used instead, since they remove certain inconsistencies of the visibility criterion. With the development of the extended finite element method (XFEM) in 1999, meshfree methods got a very strong competitor. The major drawback of meshfree methods relative to XFEM is their higher computational cost; it is also less complex to incorporate XFEM into existing FE codes. There are still efforts to improve meshfree methods with respect to material failure and fracture. However, it seems that much less attention is paid to the development of meshfree methods these days than in the 90s. Nevertheless, meshfree methods are still applied frequently in many different areas, from molecular dynamics and biomechanics to fluid dynamics.

Tuesday, August 08, 2006


Rank | Company   | Headquarters | 1st half 2006 sales ($M)
1    | Intel     | U.S.         | 15,255
2    | Samsung   | South Korea  | 8,946
3    | TI        | U.S.         | 6,765
4    | TSMC      | Taiwan       | 4,911
5    | Infineon  | Europe       | 4,872
6    | ST        | Europe       | 4,854
7    | Toshiba   | Japan        | 4,471
8    | Renesas   | Japan        | 4,013
9    | Hynix     | South Korea  | 3,157
10   | Freescale | U.S.         | 3,000

From Electronic News

Thursday, August 03, 2006

Meshfree Methods blog

A blog dedicated to Meshfree Methods has recently been set up by the USACM Specialty Committee on Meshfree Methods. This was inspired in no small part by the work of Professor Zhigang Suo and colleagues on the Applied Mechanics blogs.

Friday, July 28, 2006

Virtual Journal of Nanoscale Science & Technology

The Virtual Journal of Nanoscale Science & Technology is a weekly virtual journal containing articles that have appeared in one of the participating source journals and that fall within a number of contemporary topical areas in the science and technology of nanometer-scale structures. The articles are primarily those published in the previous week; however, at the discretion of the editors, older articles may also appear, particularly review articles. Links to other useful Web resources on nanoscale systems are also provided.

The journal provides a quick update on current research in nanoscale science and technology. If your article is published by a participating journal and also appears in the virtual journal, you may consider that your work has received some attention, to say the least. You can sign up for FREE content alerts.