LLM-3D Print: Large Language Models To Monitor and Control 3D Printing
Abstract
Industry 4.0 has revolutionized manufacturing by driving digitalization and shifting the paradigm toward additive manufacturing (AM). Fused Deposition Modeling (FDM), a key AM technology, enables the creation of highly customized, cost-effective products with minimal material waste through layer-by-layer extrusion, posing a significant challenge to traditional subtractive methods. However, the susceptibility of material extrusion techniques to errors often requires expert intervention to detect and mitigate defects that can severely compromise product quality. While automated error detection and machine learning models exist, their generalizability across diverse 3D printer setups, firmware, and sensors is limited, and deep learning methods require extensive labeled datasets, hindering scalability and adaptability. To address these challenges, we present a process monitoring and control framework that leverages pre-trained Large Language Models (LLMs) alongside 3D printers to detect and address printing defects. The LLM evaluates print quality by analyzing images captured after each layer or print segment, identifying failure modes and querying the printer for relevant parameters. It then generates and executes a corrective action plan. We validated the effectiveness of the proposed framework in identifying defects by comparing it against a control group of engineers with diverse AM expertise. Our evaluation demonstrated that LLM-based agents not only accurately identify common 3D printing errors, such as inconsistent extrusion, stringing, warping, and poor layer adhesion, but also effectively determine the parameters causing these failures and autonomously correct them without human intervention.
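To make the closed loop described in the abstract concrete, the Python sketch below shows one plausible way a layer-wise monitor-and-correct cycle could be wired up. It is a minimal illustration under stated assumptions, not the authors' implementation: `capture_layer_image`, `get_printer_settings`, and `apply_settings` are hypothetical placeholders for a specific camera and printer-firmware interface, and the JSON reply format is an assumption.

```python
# Minimal sketch of a layer-wise monitor-and-correct loop.
# NOT the paper's code: the printer-side helpers are hypothetical stubs.
import base64
import json

from openai import OpenAI  # assumes the OpenAI Python SDK; other vision LLM clients would work similarly

client = OpenAI()


def capture_layer_image(layer: int) -> str:
    """Hypothetical stand-in for the camera trigger; returns a snapshot path."""
    raise NotImplementedError("wire this to your camera setup")


def get_printer_settings() -> dict:
    """Hypothetical stand-in for querying the printer firmware for current parameters."""
    raise NotImplementedError("wire this to your printer interface")


def apply_settings(adjustments: dict) -> None:
    """Hypothetical stand-in for pushing corrected parameter values to the printer."""
    raise NotImplementedError("wire this to your printer interface")


def encode_image(path: str) -> str:
    """Base64-encode a layer snapshot for a multimodal chat request."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")


def assess_layer(image_path: str, settings: dict) -> dict:
    """Ask a vision LLM whether the latest layer shows a defect and which
    parameters to adjust, e.g. {"defect": "stringing",
    "adjustments": {"retraction_distance_mm": 6.5}} (format is an assumption)."""
    prompt = (
        "You are monitoring an FDM print. Current printer settings: "
        f"{json.dumps(settings)}. Inspect the attached layer image for defects "
        "such as inconsistent extrusion, stringing, warping, or poor layer "
        "adhesion. Reply in JSON: "
        '{"defect": <name or null>, "adjustments": {<parameter>: <new value>}}.'
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {
                    "url": f"data:image/jpeg;base64,{encode_image(image_path)}"}},
            ],
        }],
    )
    return json.loads(response.choices[0].message.content)


def monitor_print(num_layers: int) -> None:
    """Closed loop: after each layer, inspect the image and apply corrections."""
    for layer in range(num_layers):
        image_path = capture_layer_image(layer)
        settings = get_printer_settings()
        verdict = assess_layer(image_path, settings)
        if verdict.get("defect"):
            apply_settings(verdict["adjustments"])
```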
Community
Excited to Share Our Latest Research on LLMs for 3D Printing!
Website: https://sites.google.com/andrew.cmu.edu/printerchat
We’re thrilled to introduce our new paper "LLM-3D Print: Large Language Models To Monitor and Control 3D Printing", which explores how multi-modal pre-trained Large Language Models, such as ChatGPT-4o, can transform 3D printing by autonomously detecting and correcting defects in real time without human intervention.
Our research introduces a novel multi-agent LLM workflow in which multiple LLMs collaborate to autonomously detect, identify, and correct defects in real time, surpassing traditional methods that rely on extensive labeled datasets and constant expert supervision.
This framework provides real-time insights into the printing process, enhancing transparency, reducing the need for destructive testing, and building trust in automated manufacturing technologies, all while decreasing material waste from failed prints.
We believe the multi-agent LLM framework could be a game-changer for making manufacturing smarter, more efficient, and more sustainable.
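As a rough sketch of how the multi-agent split described above could look in code, the snippet below chains a detection agent, a planning agent, and an execution agent. The role prompts, the hand-off format, and the G-code hint are illustrative assumptions rather than the exact design from the paper.

```python
# Illustrative decomposition into cooperating agents: detect -> plan -> execute.
# Role prompts and hand-offs are assumptions, not the paper's exact workflow.
from openai import OpenAI

client = OpenAI()

ROLES = {
    "detector": "You inspect layer images from an FDM printer and name any visible defect.",
    "planner": "Given a named defect and the current printer settings, propose parameter changes.",
    "executor": "Translate proposed parameter changes into G-code commands (e.g. M104 to set hotend temperature).",
}


def ask_agent(role: str, user_content) -> str:
    """Send one message to a role-conditioned LLM agent and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": ROLES[role]},
            {"role": "user", "content": user_content},
        ],
    )
    return response.choices[0].message.content


def correct_defect(layer_image_b64: str, settings: dict) -> str:
    """Chain the agents and return the commands to send to the printer."""
    defect = ask_agent("detector", [
        {"type": "text", "text": "Name the most likely defect in this layer, or reply 'none'."},
        {"type": "image_url",
         "image_url": {"url": f"data:image/jpeg;base64,{layer_image_b64}"}},
    ])
    if defect.strip().lower() == "none":
        return ""  # print looks fine, nothing to send
    plan = ask_agent("planner", f"Defect: {defect}. Current settings: {settings}.")
    return ask_agent("executor", f"Apply this plan: {plan}")
```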