Researchers develop advanced modelling to improve biomass milling efficiency

As demand grows for renewable feedstocks such as wood chips, crop residues and municipal waste, the U.S. Department of Energy’s Bioenergy Technologies Office (BETO) has identified material handling as a key barrier to commercial-scale production.
Unlike minerals or food powders, biomass particles vary widely in shape, density and internal structure, making them difficult to mill consistently.
Clogged equipment and uneven particle sizes remain persistent challenges for industry.
To address this, researchers at Idaho National Laboratory (INL) have applied computational methods commonly used in other industries to bioenergy feedstocks.
Funded by BETO through the Feedstock-Conversion Interface Consortium, the team has developed models that help predict how biomass behaves during size reduction, allowing engineers to design more efficient processes.
“These tools provide insights that can guide the development of more energy-efficient and effective milling strategies,” said Yidong Xia, senior research scientist at INL.
Recent publications from Xia’s team highlight how emerging computational approaches, including discrete element modelling and machine learning, can be used to solve long-standing problems related to particle size distribution and thermochemical conversion.
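The article does not detail the team’s simulation setup, but the discrete element approach it mentions works by resolving contact forces between individual particles and integrating their motion through time. The toy sketch below, with invented particle properties and a simple linear spring-dashpot contact, illustrates that core loop only; it is not the INL model.

```python
# A minimal, generic discrete element sketch: two spherical particles colliding
# along one axis under a linear spring-dashpot normal contact model.
# All properties below are invented for illustration; this is not INL's model.
import numpy as np

radius = 1.0e-3                       # particle radius, m (hypothetical)
density = 500.0                       # kg/m^3, roughly milled-biomass range (assumed)
mass = density * (4.0 / 3.0) * np.pi * radius**3

k_n = 1.0e4                           # normal spring stiffness, N/m (assumed)
c_n = 0.05                            # normal damping coefficient, N*s/m (assumed)

x = np.array([0.0, 2.5e-3])           # particle centre positions, m
v = np.array([0.2, -0.2])             # velocities, m/s (approaching head-on)

dt = 1.0e-7                           # time step, s
for _ in range(200_000):
    overlap = 2.0 * radius - (x[1] - x[0])      # positive when in contact
    f = 0.0
    if overlap > 0.0:
        overlap_rate = -(v[1] - v[0])           # d(overlap)/dt
        f = k_n * overlap + c_n * overlap_rate  # force on particle 1, +x direction
        f = max(f, 0.0)                         # no cohesion: repulsion only
    a = np.array([-f, f]) / mass                # equal and opposite accelerations
    v = v + a * dt                              # semi-implicit Euler update
    x = x + v * dt

print("post-collision velocities (m/s):", v)
```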
The research focused on corn stover, a widely available agricultural residue.
Unlike uniform mineral particles, corn stalks and leaves have complex internal structures, making them unpredictable during milling.
Experiments at INL’s Process Development Unit (PDU), part of the Department of Energy’s Biomass Feedstock National User Facility, were used to generate data for the computational models.
“Our aim is to provide industry partners with actionable information that improves operational performance at industrial scale,” said senior bioenergy scientist Damon Hartley.
Findings from the project revealed that mill speed and power had less influence on particle size than factors such as discharge screen size and moisture content.
Incorporating moisture into the models allowed the INL team to build a more comprehensive mathematical framework for predicting particle-size evolution.
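The article does not give the mathematical form of that framework. A common way to express particle-size evolution during milling is a discretised population balance, in which a selection function decides how much of each size class breaks per mill pass and a breakage matrix redistributes the broken mass into finer classes. The sketch below illustrates that general technique; the size classes, breakage matrix and moisture dependence of the selection function are placeholders, not the INL model.

```python
# A generic discretised population balance (selection/breakage matrix) sketch
# for particle-size evolution during milling. All numbers are illustrative.
import numpy as np

bins_mm = np.array([8.0, 4.0, 2.0, 1.0, 0.5])   # size classes, coarse to fine
n = len(bins_mm)

def selection(moisture):
    """Fraction of each size class that breaks per mill pass (hypothetical):
    coarser particles break more readily; higher moisture suppresses breakage."""
    base = np.array([0.8, 0.6, 0.4, 0.2, 0.0])
    return base * np.exp(-3.0 * moisture)        # moisture as a mass fraction

# Breakage matrix B[i, j]: fraction of broken class-j mass landing in class i.
# Each column that can break sums to 1, so total mass is conserved.
B = np.array([
    [0.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.0, 0.0, 0.0],
    [0.3, 0.5, 0.0, 0.0, 0.0],
    [0.1, 0.3, 0.6, 0.0, 0.0],
    [0.1, 0.2, 0.4, 1.0, 0.0],
])

def mill_pass(m, moisture):
    """One pass: broken mass is redistributed to finer classes via B."""
    s = selection(moisture)
    broken = s * m
    return (m - broken) + B @ broken

m = np.array([1.0, 0.0, 0.0, 0.0, 0.0])          # all mass starts in the 8 mm class
for _ in range(3):
    m = mill_pass(m, moisture=0.10)
print("mass fractions after 3 passes:", np.round(m, 3))
```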
A deep neural operator model, calibrated against the experimental training data, demonstrated strong accuracy in capturing the influence of these key variables.
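The article does not describe the architecture, but a common form of deep neural operator, the DeepONet, pairs a branch network that encodes the operating condition with a trunk network that encodes the query point, and takes their inner product to predict the output function. The NumPy sketch below shows only that untrained forward pass, with random weights and hypothetical inputs.

```python
# A minimal DeepONet-style forward pass in NumPy, purely illustrative of the
# "deep neural operator" idea. Sizes, inputs and weights are random
# placeholders, not the INL model.
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, widths):
    """Tiny fully connected network with tanh activations and random weights."""
    for w_in, w_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(scale=1.0 / np.sqrt(w_in), size=(w_in, w_out))
        x = np.tanh(x @ W)
    return x

# Branch input: a sampled description of the milling condition, e.g. the feed
# size distribution plus scalars such as moisture and screen size (hypothetical).
condition = rng.uniform(size=20)

# Trunk input: the query points at which the output is evaluated, e.g. particle
# sizes at which to predict the cumulative size distribution (hypothetical).
query_sizes = np.linspace(0.1, 8.0, 50).reshape(-1, 1)   # mm

p = 32                                            # shared latent dimension
branch = mlp(condition.reshape(1, -1), [20, 64, p])       # shape (1, p)
trunk = mlp(query_sizes, [1, 64, p])                      # shape (50, p)

# DeepONet output: inner product of branch and trunk features at each query size.
prediction = trunk @ branch.ravel()
print(prediction.shape)                           # (50,): one value per query size
```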
The team’s work shows that high-quality baseline testing significantly improves model performance, helping to reduce the need for costly large-scale physical trials.
“The more detailed data we have early in the process, the stronger the results when we eventually move to industrial-scale testing,” Xia said.
For companies developing biorefineries or upgrading existing facilities, the research offers a route to faster design cycles and reduced operational risk.
Many firms lack access to the advanced modelling tools and large-scale testing capabilities available at national laboratories.
“This is where we can make a difference,” Xia said. “Our role is to provide the expertise, hardware and software needed to help industry optimise processes and improve performance. Ultimately, we want to support wider commercial adoption of bioenergy technologies.”














