3D Scanning with iPhone: Exciting Potential, But Read This Before You Start
April 19, 2024
In September, Shopify added the ability to create 3D scans using an iPhone to their iOS app. This was possible due to a new iOS 17 feature that promised to make 3D scans much easier for the rest of us.
Unfortunately, creating good 3D models turns out to be more difficult than promised. In fact, some of my scans look hilarious. But eventually, I was able to create a realistic 3D model.
So, before you jump headfirst into the world of virtual mugs and AR lampshades, let’s dive into the reality of iOS 17’s 3D scanning. We’ll explore what it is, how it works, what works well, what to avoid, and how to maximize your chances of actually getting a decent scan.
Why Do 3D Models Matter?
Earlier this year, I wrote about the billions of dollars lost due to ecommerce returns and the environmental impact of returned products going into landfills. Cloudinary commissioned a study that found that “a third (30%) of respondents reported that they returned products they bought because they didn’t look as expected on the website.”
One of Cloudinary’s suggestions for how to decrease returns and cart abandonment was to use 3D models. Survey respondents said they “are more likely to buy if they have access to helpful media such as 360-degree spinsets (57%), 3D models (53%), and user-generated videos (50%).”
Shopify echoed this argument when it described three potential benefits of the new 3D scanning feature in its app:
- Save money and time
- Boost buyer confidence and conversion rates
- Minimize returns and customer service demands
Outside of ecommerce, 3D models may also have increased importance in the future depending on the success of augmented reality and virtual reality headsets. I will be watching the upcoming release of Apple’s Vision Pro closely to see if it increases the number of people creating their own 3D models.
What is the iOS 17 3D Scanner?
This feature uses the iPhone’s camera and LiDAR sensor to create photogrammetric 3D models: photogrammetry stitches multiple photographs of an object together into a 3D representation. Apple provides an API for this feature, which is what Shopify and others use to embed the technology in their own apps.
Scanning involves slowly circling the object while the app guides you, offering feedback and requesting additional scans as needed. The final stitching and processing takes several minutes.
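For the curious, here is a minimal sketch of what embedding that API can look like. This is not Shopify’s code, just my own illustration of RealityKit’s Object Capture session hosted in SwiftUI; the view name and image directory are placeholders, and error handling is trimmed.

```swift
import Foundation
import SwiftUI
import RealityKit

// Minimal sketch of embedding Apple's Object Capture flow (iOS 17+).
// Real apps should first check ObjectCaptureSession.isSupported and then
// advance through the detection and capturing phases with
// session.startDetecting() and session.startCapturing().
struct ScanView: View {
    @State private var session = ObjectCaptureSession()

    var body: some View {
        // ObjectCaptureView renders the camera feed plus Apple's on-screen
        // guidance that walks you around the object.
        ObjectCaptureView(session: session)
            .onAppear {
                // Captured images are written here, then handed to
                // PhotogrammetrySession for the final stitching step.
                let imagesDirectory = FileManager.default.temporaryDirectory
                    .appendingPathComponent("ScanImages", isDirectory: true)
                try? FileManager.default.createDirectory(
                    at: imagesDirectory, withIntermediateDirectories: true)

                session.start(imagesDirectory: imagesDirectory)
            }
    }
}
```

Most of the guidance and feedback described above comes from the session and view themselves; the host app mostly wires them together.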
Learn from my (Numerous) Mistakes
Avoid Reflective Surfaces
My first scan was of my Progressive Web Apps book coffee mug. The end result looked more like it belonged in the clay-crafted world of Wallace and Gromit than in my kitchen cupboard.
Beware of Thin or Transparent Objects
After my coffee mug failed, I looked around for something without a shiny surface. I found a papier-mâché piggy bank with matte surfaces that our kid had created when they were younger.
While I love this scan, the wings look more like cotton candy than actual feathers. The feathers are too thin, and the LiDAR scanner passes right through them.
Unfortunately, I made a similar mistake with my next object. We had recently purchased new table lamps. They seemed perfect for scanning.
They weren’t. Well, unless you were looking for a 3D model of a lamp that barely survived a fire.
Pick the Right Size
Objects need to be larger than 3 inches on all sides in order for the iPhone scanner to work. If you try to scan something smaller, the app will provide feedback and refuse to start the scan.
In addition, I found that if the object I tried to scan was too close in size to the surface it was sitting on, the scanner had a hard time distinguishing between the two.
That’s why the top of the table was included in the lamp model above. The lamp shade was nearly the same size as the end table the lamp was sitting on. My phone had a difficult time figuring out where the lamp ended and where the end table began.
Turntables Don’t Work
Many other 3D scanning solutions use an automated turntable so the camera and lighting stay fixed. I tried to replicate this manually using a rotating serving tray (a lazy Susan) from our kitchen.
It didn’t work. The scanner requires you to move around the object, not the other way around: the capture relies on the phone’s own motion and position tracking, so a stationary phone pointed at a rotating object confuses it.
Recommendations for Better Scans
- Choose the Right Object: Opt for solid, textured, non-shiny objects of appropriate size.
- Use a Tall, Round Table: A tall, round surface lets you circle the object without bending like a pretzel.
- Opt for Soft, Diffused Lighting: Use multiple soft lights from different directions; with light coming from only one side, you will cast a shadow over the object as you circle it.
Consider Other 3D Scanning Options
While iOS 17’s scanner democratizes 3D model creation, its limitations may hinder its utility, especially in ecommerce where image quality is paramount. Professional 3D models often involve higher-end equipment or can be derived from product design files.
Conclusion and Looking Ahead
Despite the challenges, I’m still excited about scanning in iOS 17. I love the fact that it is significantly easier to create 3D models. I had never created one before. Anyone can do it now.
I’m hopeful the quality will increase in the future. Most iPhones support higher resolutions than are used in the scans. Apple likely limits the resolution to speed model processing, but as far as I know, there’s no technical reason why the scans couldn’t use higher resolution photos.
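To show where that limit lives, here is a hedged sketch of the reconstruction step using RealityKit’s PhotogrammetrySession. The detail level is a parameter on the request; I’m assuming on-device processing sticks to the lower levels such as .reduced, while .full and .raw are aimed at Macs with more processing power.

```swift
import Foundation
import RealityKit

// Sketch of the reconstruction step: turn a folder of captured images into a
// USDZ model. The requested detail level largely determines texture resolution
// and processing time.
func reconstructModel(from imagesDirectory: URL, to outputURL: URL) async throws {
    let session = try PhotogrammetrySession(input: imagesDirectory)

    // Ask for a model at reduced detail; .medium, .full, and .raw also exist,
    // though not every level is available for on-device processing.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])

    // Watch the output stream until the session reports it has finished.
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputURL.path)")
        }
    }
}
```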
That said, my optimism is tempered by the limitations. If you’re going to scan many objects or if your objects don’t fit what the iPhone can capture well, you may need to find some other 3D scanning solution.
Stay tuned for Part Two, where I’ll discuss integrating 3D models into web pages and the performance considerations involved.