Who knew modern knitting machines could be so complicated? The type of whole-garment method that’s used to produce socks, gloves, sportswear, shoes, car seats, and other textiles requires knowledge of the low-level language used to create knitting machine routines. It’s expert-level stuff, and the stakes are high — even minor mistakes can ruin an entire garment.
That’s why researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) investigated new approaches to streamline the knitting process. The first of their two proposed systems, InverseKnit, translates photos of knitted patterns into knitting instructions. The second — CADKnit — employs a combination of two-dimensional images, computer-aided design software, and photo design techniques to let users knit design templates.
Getting InverseKnit up and running required compiling a data set of 17 different knitting instructions, along with matching images of each pattern. The pattern instructions were extracted from knitwear manufacturer Shima Seiki’s KnitPaint software, while the photos were produced both by knitting a subset of 1,044 real-world patches and rendering the patches using KnitPaint’s pattern preview feature. (The coauthors note that each knitted pattern effectively provides two full opposite patterns, doubling the size of the real knitted data set to 2,088 samples in total.)
The team trained an AI algorithm on the corpus to “teach” it to interpret the two-dimensional knitting instructions from images, such that it could generate machine-readable instructions given a picture of an object. The instructions triggered one of several of an attached knitting machine’s basic operations, like a knit (which pulls a loop of yarn through all current loops), a tuck (which stacks a new loop onto a needle), a miss (which skips a needle), or a transfer (which moves the needle’s content to the other bed).
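To make the target representation concrete, here is a minimal sketch (not the CSAIL code — the names and grid format are illustrative assumptions) of how a machine-readable instruction pattern can be modeled: a two-dimensional grid where each cell is one of the basic operations described above, which the machine executes row by row.

```python
# Illustrative sketch: a knitting-instruction pattern as a 2D grid of
# per-needle operations. The Op names mirror the operations described
# in the article; the grid encoding itself is a hypothetical example.
from enum import Enum

class Op(Enum):
    KNIT = "K"      # pull a loop of yarn through all current loops
    TUCK = "T"      # stack a new loop onto a needle
    MISS = "M"      # skip a needle
    TRANSFER = "X"  # move the needle's content to the other bed

def render(pattern):
    """Serialize an instruction grid, one machine pass per line."""
    return "\n".join("".join(op.value for op in row) for row in pattern)

# A toy 2x4 grid: InverseKnit's job is to map a photo of a knitted
# patch to a grid like this one.
pattern = [
    [Op.KNIT, Op.TUCK, Op.KNIT, Op.MISS],
    [Op.KNIT, Op.KNIT, Op.TRANSFER, Op.KNIT],
]
print(render(pattern))  # prints "KTKM" then "KKXK"
```

The point of the sketch is only that the learning problem is image-to-grid: each output cell selects one discrete machine operation.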
The team reports that in tests, InverseKnit produced accurate instructions 94% of the time. The current iteration works only with a small sample size (and only with acrylic yarn), doesn't explicitly model pattern scale, and doesn't impose hard constraints to rule out invalid instruction sequences. But the researchers hope to expand the sample pool and materials in future work.
“Current state-of-the-art computer vision techniques are data-hungry, and they need many examples to model the world effectively,” said Prof. Jim McCann, who works on similar projects at the unrelated Carnegie Mellon Textiles Lab. “With InverseKnit, the team collected an immense dataset of knit samples that, for the first time, enables modern computer vision techniques to be used to recognize and parse knitting patterns.”
Knitting software tools that create three-dimensional meshes aren’t as much of a rarity as they used to be. Still, they’re relatively complex and tend to introduce distortions, impeding the design process.
By contrast, CADKnit was engineered with casual users in mind. It lets them write their own reusable programs, use preexisting ones, or manipulate the corresponding shapes and patterns visually, while a code view exposes the low-level instructions automatically generated for the current layout. An inspection panel lets them manually edit input parameters or use a mouse to extend shapes directly from their boundaries on the bed layout, and pattern layers can be drawn interactively, then exported or resampled for different shapes and sizes.
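The "reusable program plus resampling" idea can be sketched roughly as follows. This is a hypothetical illustration, not the CADKnit API: `rib_template` and `resample` are invented names, and the stitch symbols are placeholders.

```python
# Illustrative sketch of a reusable pattern program: a function from size
# parameters to a 2D layer of stitch symbols, plus nearest-neighbor
# resampling so one pattern can be reused at different garment sizes.

def rib_template(width, height, rib=2):
    """Generate a simple knit/purl ribbing layer of the given size."""
    row = [("K" if (x // rib) % 2 == 0 else "P") for x in range(width)]
    return [list(row) for _ in range(height)]

def resample(layer, new_w, new_h):
    """Nearest-neighbor resampling of a pattern layer to a new size."""
    h, w = len(layer), len(layer[0])
    return [
        [layer[y * h // new_h][x * w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

small = rib_template(8, 4)          # an 8x4 ribbing swatch
large = resample(small, 16, 8)      # the same ribbing, scaled up 2x
```

The design point is that the pattern is defined once as a program, and shape or size changes become parameter edits rather than hand-rewritten machine code.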
CADKnit even spits out warnings in the event an “undesirable” knitting structure makes it through the drafting stage.
“As far as machines and knitting go, this type of system could change accessibility for people looking to be the designers of their own items,” said CSAIL PhD student and lead author Alexandre Kaspar. “We want to let casual users get access to machines without needing programming expertise, so they can reap the benefits of customization by making use of machine learning for design and manufacturing.”
To validate CADKnit’s design, the researchers recruited non-expert users to create patterns for socks, hats, scarves, sweatpants, and yoked shirts and adjust their sizes and shapes. Most reported in post-session surveys that they found it easy to manipulate and customize the garments, and to fabricate knitted samples. But there’s work left to be done.
The researchers found that apparel that can be connected in various ways, like sweaters, didn’t work well with CADKnit because it lacks a means of describing the whole design space. Furthermore, they note it can only use one yarn for a shape and is limited to relatively basic patterns, which they intend to rectify by introducing a stack of yarn at each stitch and hierarchical data structures that only incorporate necessary stitches.
“The impact of 3D knitting has the potential to be even bigger than that of 3D printing. Right now, design tools are holding the technology back, which is why this research is so important to the future,” added McCann.