Recent advances in robotics and machine learning have enabled the automation of many real-world tasks, including various manufacturing and industrial processes. Among other applications, robotic and artificial intelligence (AI) systems have been successfully used to automate some steps in manufacturing clothes.
Researchers at Laurentian University in Canada recently set out to explore the possibility of fully automating the knitting of clothes. To do this, they developed a model that converts fabric images into comprehensive instructions that knitting robots can read and follow. Their model, outlined in a paper published in Electronics, was found to successfully produce patterns for both single-yarn and multi-yarn knitted garments.
“Our paper addresses the challenge of automating knitting by converting fabric images into machine-readable instructions,” Xingyu Zheng and Mengcheng Lau, co-authors of the paper, told Tech Xplore.
“Traditional methods require manual labeling, which is labor-intensive and limits scalability. Inspired by this gap, our goal was to develop a deep learning system that reverse-engineers knitted fabrics from images, enabling greater customization and scalability in textile manufacturing.”
The deep learning-based approach developed by Zheng, Lau and their colleagues produces knitting instructions in two main steps: a “generation phase” followed by an “inference phase.”
“In the generation phase, an AI model processes real fabric images into clear synthetic representations and then interprets these synthetic images to predict simplified knitting instructions, known as front labels,” said Haoliang Sheng and Songpu Cai, co-authors of the paper. “In the inference phase, another model uses the front labels to deduce complete, machine-ready knitting instructions.”
Image illustrating the pipeline from the study. It begins with a real knitted fabric image, followed by the Generation Phase, where the “Refiner” and “Img2prog” modules produce a simplified front label. Then, in the Inference Phase, the “Residual Model” generates complete knitting instructions. Credit: Sheng et al.

The complete knitting instructions produced by the model. The final complete label includes both the visible front layer and the hidden back layer, ensuring the output is ready for direct use by knitting machines. Credit: Sheng et al.

More samples generated by the model. Credit: Sheng et al.
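To make the two-phase structure above more concrete, the sketch below chains hypothetical Refiner, Img2Prog and ResidualModel modules in PyTorch. The module internals, tensor shapes and the 17-symbol instruction vocabulary are assumptions made for illustration; they are not taken from the authors' implementation.

```python
# Illustrative sketch of the two-phase pipeline described in the article.
# The module internals, tensor shapes, and the 17-symbol instruction
# vocabulary are assumptions for illustration, not the authors' code.
import torch
import torch.nn as nn

NUM_STITCH_TYPES = 17  # assumed size of the knitting-instruction vocabulary


class Refiner(nn.Module):
    """Generation phase, step 1: map a real fabric photo to a cleaner, synthetic-style image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, real_image):        # (B, 3, H, W)
        return self.net(real_image)       # refined image, same shape


class Img2Prog(nn.Module):
    """Generation phase, step 2: predict per-stitch 'front label' logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, NUM_STITCH_TYPES, 1),
        )

    def forward(self, refined_image):
        return self.net(refined_image)    # (B, NUM_STITCH_TYPES, H, W)


class ResidualModel(nn.Module):
    """Inference phase: deduce the complete (front + back) instruction map from front labels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(NUM_STITCH_TYPES, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, NUM_STITCH_TYPES, 1),
        )

    def forward(self, front_logits):
        return self.net(front_logits)     # complete-label logits


def image_to_instructions(real_image, refiner, img2prog, residual):
    """Run both phases and return a (B, H, W) grid of stitch-type IDs."""
    refined = refiner(real_image)
    front = img2prog(refined)
    complete = residual(front)
    return complete.argmax(dim=1)


# Usage on a dummy 64x64 fabric image.
dummy = torch.rand(1, 3, 64, 64)
instructions = image_to_instructions(dummy, Refiner(), Img2Prog(), ResidualModel())
print(instructions.shape)  # torch.Size([1, 64, 64])
```

The key design idea the sketch tries to capture is the split the authors describe: one stage turns a noisy photograph into a simplified front label, and a separate stage fills in the hidden back-layer stitches needed for a machine-ready program.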
The new fabric pattern creation model introduced by the researchers has several valuable features and advantages. Most notably, it can produce both single and multi-yarn knitting patterns, accurately incorporate rare stitches, and be easily applied to new fabric styles.
The researchers evaluated their system in a series of tests, using it to produce patterns for around 5,000 textile samples made of both natural and synthetic fabrics. They found that it performed remarkably well, generating accurate knitting instructions for most of these items.
“Our model attained an accuracy of over 97% in converting images into knitting instructions, significantly outperforming existing methods,” said Sheng and Cai.
“Our system also effectively handled the complexity of multi-colored yarns and rare stitch types, which were major limitations in earlier approaches. In terms of applications, our method enables fully automated textile production, reducing time and labor costs.”
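The reported figure measures how closely predicted instruction maps match the ground-truth ones. As an illustration only, and assuming accuracy is counted per stitch cell (the paper may define its metric differently), such a score can be computed as follows:

```python
# Minimal per-stitch accuracy check between a predicted instruction grid and
# the ground truth. The exact metric used in the paper may differ; this only
# shows what "accuracy over knitting instructions" can mean in practice.
import numpy as np

def instruction_accuracy(predicted, ground_truth):
    """Fraction of grid cells whose stitch ID matches the reference."""
    predicted = np.asarray(predicted)
    ground_truth = np.asarray(ground_truth)
    assert predicted.shape == ground_truth.shape
    return float((predicted == ground_truth).mean())

# Example: two 20x20 instruction grids that differ in a single cell.
truth = np.zeros((20, 20), dtype=int)
pred = truth.copy()
pred[0, 0] = 1
print(instruction_accuracy(pred, truth))  # 0.9975
```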
The new model developed by Lau, Zheng, Sheng and Cai could soon be tested and improved further. Eventually, it could be deployed in real-world settings, potentially supporting the automated mass production of customized knitted clothes. Paired with robotic knitting systems, the model could also allow designers to quickly prototype and test new patterns without manually creating machine-readable instructions.
“Moving forward, we plan to address dataset imbalances, particularly for rare stitches, through advanced augmentation techniques,” added Lau and Zheng.
“We also aim to incorporate color recognition to improve both structural and visual fidelity. Expanding the system to handle variable input and output sizes is another goal, allowing it to adapt dynamically to different fabrics. Finally, we intend to extend our pipeline to complex 3D knitted garments and explore cross-domain applications such as weaving and embroidery.”
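As a generic illustration of how rare-class imbalance is often handled during training (a standard rebalancing trick, not the authors' planned augmentation method), the loss can be reweighted by inverse stitch-type frequency so that rare stitches are not drowned out by common ones:

```python
# Generic sketch: counter rare-stitch imbalance with inverse-frequency class
# weights in the cross-entropy loss. This illustrates one common rebalancing
# technique; it is not the authors' method. The 17-class vocabulary is assumed.
import torch
import torch.nn as nn

def make_class_weights(label_grids, num_classes):
    """Inverse-frequency weights so rare stitch types contribute more to the loss."""
    counts = torch.bincount(label_grids.flatten(), minlength=num_classes).float()
    weights = counts.sum() / (counts + 1.0)   # +1 avoids division by zero
    return weights / weights.mean()           # normalize around 1.0

# Example with a batch of randomly generated instruction grids.
labels = torch.randint(0, 17, (8, 20, 20))
criterion = nn.CrossEntropyLoss(weight=make_class_weights(labels, 17))
logits = torch.randn(8, 17, 20, 20, requires_grad=True)
loss = criterion(logits, labels)
loss.backward()
```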
More information:
Haoliang Sheng et al, Knitting Robots: A Deep Learning Approach for Reverse-Engineering Fabric Patterns, Electronics (2025). DOI: 10.3390/electronics14081605
© 2025 Science X Network
Citation:
System converts fabric images into complete machine-readable knitting instructions (2025, May 2)
retrieved 2 May 2025
from https://techxplore.com/news/2025-05-fabric-images-machine-readable.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.