While macOS Monterey (also known as macOS 12) brings several new features for users, the update also comes with significant improvements for developers, including brand-new APIs that enable new possibilities for third-party apps. One of the new APIs is "Object Capture," which lets users easily create 3D models of any object using the iPhone camera.

Apple has been pushing the adoption of AR technologies for a while now, but creating a 3D model may not be the easiest thing in the world for some people. Usually, you would need advanced cameras to take 3D captures and then render them in dedicated software. But that changes with macOS Monterey and iOS 15. With the Object Capture API, Apple says this whole process of capturing and rendering 3D models will take only a few minutes.

While there is still no app available in the App Store with this new feature, Apple provides some examples of how to compile an app using the new API, and of course I had to test it myself.

Requirements

First, you need an iPhone or iPad with a dual-lens rear camera (and preferably a LiDAR scanner, although one is not required) to capture depth data. I used my iPhone 12 Pro Max running the iOS 15 beta for this demonstration.

With the example app provided by Apple, I had to capture multiple images of the object at different angles so the API can render the 3D object in 360 degrees. You need about 30 photos to create a 3D model, but Apple recommends using many more than that to get a high-quality result. In a real-life scenario, you should also have optimal lighting conditions, a tripod, and a mechanism to automatically rotate the object without changing its position. I don't have any of these, so of course the quality will be affected.

After capturing the photos, I sent them to my Mac running macOS Monterey to render the 3D model.

As brands and content makers create more augmented reality experiences, the demand for tools to create 3D content grows in kind. And Apple may have just leveled the playing field for 3D content creation.

On Monday at WWDC 2021, Apple introduced Object Capture, a photogrammetry tool built on the Swift programming language and coming to the Monterey edition of macOS via RealityKit 2, the next version of Apple's AR engine.

Object Capture stitches together a series of photographs to create a 3D model of the subject. Users can also preview the model via AR Quick Look to confirm its accuracy. Content generated via Object Capture can then be used in AR experiences created with Reality Composer or Xcode, as well as in third-party platforms like Unity MARS and Maxon's Cinema 4D.

Along with Object Capture, Apple is adding a new set of APIs via RealityKit 2 for "more realistic and complex AR experiences with greater visual, audio, and animation control, including custom render passes and dynamic shaders." It's unclear whether LiDAR via iPhone and iPad is required for Object Capture.

Arts and crafts marketplace Etsy and furniture retailer Wayfair are among the early adopters of the technology. The latter will use Object Capture to expand the products customers can preview via ARKit in its mobile app.

As important as ARKit has been in giving developers the ability to integrate AR into mobile apps, creating the 3D content underlying those experiences is another set of tasks altogether. Gifting everyone with a Mac or MacBook and an iPhone or iPad the ability to create their own 3D objects without third-party software or hardware could facilitate a huge leap forward in the proliferation of AR content.

Monterey will arrive as a free software update this fall, but for those on the bleeding edge, it is available today as a developer beta, with the public beta dropping next month.
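The "send photos to the Mac and render" step described above is exposed to developers through RealityKit's `PhotogrammetrySession` API on macOS Monterey. As a rough sketch (not Apple's sample app), a minimal command-line tool that turns a folder of captured photos into a USDZ model could look like the following; the `Images/` and `model.usdz` paths are placeholders:

```swift
import Foundation
import RealityKit

// Folder of HEIC/JPEG photos captured on the iPhone (placeholder path).
let inputFolder = URL(fileURLWithPath: "Images/", isDirectory: true)
// Destination for the generated model; USDZ can be previewed with AR Quick Look.
let outputURL = URL(fileURLWithPath: "model.usdz")

do {
    // Create a session over the image folder; default configuration is fine to start.
    let session = try PhotogrammetrySession(
        input: inputFolder,
        configuration: PhotogrammetrySession.Configuration())

    // Request a single model file; detail levels range from .preview up to .raw.
    try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])

    // The session reports progress and results as an async sequence of outputs.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fractionComplete):
                print("Progress: \(Int(fractionComplete * 100))%")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            case .processingComplete:
                exit(0)
            case .requestError(_, let error):
                print("Request failed: \(error)")
                exit(1)
            default:
                break
            }
        }
    }
    RunLoop.main.run()  // keep the CLI alive while processing runs
} catch {
    print("Could not start session: \(error)")
}
```

Note that this requires a Mac that meets Object Capture's hardware requirements; higher detail settings trade longer processing time for better texture and geometry quality.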