TED Talk: Why I Built a Time Machine!

I spent the spring at TED this year, developing a VR piece about Coney Island’s Luna Park and writing a talk about how technology has shaped entertainment. Cramming a TED talk into six minutes was no easy feat. The idea was to use the past as a lens: to consider how the technology of the first and second industrial revolutions defined the last century of entertainment, then to speculate on how the dawn of today’s natural user interfaces will lead the way to new forms of entertainment that will extend into the next century. So, here it is at last, my TED talk!

The talk centers on a project I have been working on for the last several years: a historic reconstruction of Coney Island’s Luna Park as it appeared a century ago. The work done at TED this year resulted in a prototype VR experience that lets users virtually walk through Luna Park as it existed in the milestone years of 1903, 1908, and 1914, and jump between those years as they explore. There’s tons more modeling to do to build out the experience, and after the talk I had to put it aside for a while to reflect on why I am doing this completely obsessive project.

The current state of VR seems like an in-between technology to me. This isn’t where we are going, but rather a stop along the way. I’m guessing some AR/VR device will eventually come along that pairs with just the right software experience to become the tipping point for mass adoption, but I think we’re still a ways away. In the meantime, my hope is to keep building out the Time Machine so that when the technology finally catches up, it will be ready. There’s lots more modeling work to be done, and then the exhaustive texturing that will really make Luna Park look like the glittering, glowing city of fire that it was.

A lot of my thinking about the piece centered on how to turn it into a product that will interest people beyond a very small group of Coney Island history buffs. It’s playful, but is it a game? Or maybe the rides get built out as virtual rides, and that could justify its existence as a software product? I toyed with the idea of a collection game where players roam the environment exploring and collecting people from the Scan-A-Rama archive. Then it dawned on me… this is literally a Time Machine, and there are tons of other places I would travel in time if I could. Where would you go if you could travel anywhere in time and space? That’s when I realized that this is the first proof-of-concept destination for a Time Machine platform.

I have a vision of the Time Machine as a set of camera controller objects that let anyone who is passionate about a place build their own time machine, whether it travels to ancient Egypt or anywhere else. That’s my real “big idea” coming out of TED: the Time Machine platform should be an open source library of template Unity files empowering anyone to create and share their explorations. This could be a great tool for architects, archaeologists, crime scene investigators, history and literary buffs, anyone interested in experiencing places that can no longer be physically visited and that bear examination as they changed over time.
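To make the platform idea concrete, here is a minimal sketch, purely illustrative and not the actual Unity implementation, of the core object a Time Machine destination implies: a place defined by a set of year-stamped snapshots, where “traveling” is just switching which snapshot is active. The class and method names are my own placeholders.

```python
# A minimal, purely illustrative sketch (not the actual Unity project):
# a Time Machine destination is a place with year-stamped snapshots, and
# "traveling" just switches which snapshot is currently shown.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Destination:
    name: str
    years: List[int]                    # milestone years with available models
    current: int = field(init=False)

    def __post_init__(self) -> None:
        self.current = self.years[0]    # start at the earliest year

    def jump(self) -> int:
        """Cycle to the next available year, wrapping around at the end."""
        i = self.years.index(self.current)
        self.current = self.years[(i + 1) % len(self.years)]
        return self.current


# The years from the Luna Park prototype
luna_park = Destination("Luna Park", [1903, 1908, 1914])
print(luna_park.jump())   # 1908
print(luna_park.jump())   # 1914
print(luna_park.jump())   # 1903 -- wraps back to the first year
```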

Time jumping already exists as a software principle. Consider Undo/Redo, the DVR, or Google Street View, which already has about 15 years’ worth of data. What happens when Street View hits 50 years, or a century? Time travel is coming; it’s just going to be different than what H.G. Wells predicted. The “Eye in the Sky” episode of Radiolab also describes another form of time travel, one with definite ethical issues. In an increasingly digital world, I predict that temporal travel will become more pervasive, so where would you want to travel? I look forward to hearing from potential collaborators or anyone who is passionate about places they would want to travel to.

If you have a Windows VR setup with an HTC Vive, try out the Time Machine prototype yourself. Special props to Matt Laverty of Atomic Veggies and developer Daniel Alhadeff for their help on the project!

Download Prototype for 64-bit Windows systems with HTC Vive.
Controls: use the controller touchpad to move, and press the small button above the touchpad to jump between the years 1903, 1908, and 1914 (for example, while looking at the entrance).


Fred at TED!

It’s been hard to keep my mouth shut the last two months, but I am finally able to announce that I will be spending the next three months as a Resident at TED, working on a project to recreate Thompson & Dundy’s turn-of-the-20th-century Luna Park in VR, with an accompanying TED talk in June.


This is the next step in the evolution of a project that I have been working on, on and off, for over 20 years, since I first went to NYU’s Interactive Telecommunications Program in the mid-1990s. Back then, I created a series of QuickTime VR panoramas of Luna Park, and a few years ago, when 3D printing became widely accessible, I rekindled the project to produce a 3D-printed model of the park that was exhibited in the Coney Island Museum. This upcoming phase will be focused on recreating the entirety of the park as a VR experience that allows viewers to explore the park firsthand. It is my hope to share the models under a Creative Commons license that encourages others to use virtual Luna Park, including as a virtual meeting place in the future.

There are several themes I am considering for the TED talk I will be giving in June, but the general idea I am working with is the history and legacy of technology as entertainment, specifically Luna Park’s role in defining everything we do for amusement today. Luna Park came out of the golden age of world’s fairs, a time when the attractions of the 1893 World’s Columbian Exposition were effectively the physical manifestation of a memory palace containing all of man’s culture and knowledge. It was the real-world embodiment of everything Google is for us today. The park was not only a showcase for fantastical, otherworldly architecture; it transported the melting pot of American immigrants to exotic locales around the world, even to the moon, anticipating what the future of entertainment would be like. Electricity, phonographs, motion pictures, infant incubators, simulation rides: they all converged at Luna Park. I cannot wait to virtually resurrect it and share it with the world!


A screencap of the Luna Park VR prototype, with cel-shaded rendering style

CNC Carving Glitch Art

Greetings from the woods of Stanwood, Washington, where I am teaching in the remote paradise known as the Pilchuck Glass School. Session 2’s theme is Play, and our class has been playing with methods of creating glass castings from digitally generated sculptures.

As part of the class we are CNC carving graphite molds, so I brought a bunch of graphite that I bought on eBay. I had previously carved blocks from the same seller with no problems, so I was bamboozled when I started to have carving issues with my molds here. I spent the better part of the last two days wrestling with the X-Carve, thinking the axis slipping was mechanical. Then I discovered the carbide bit was totally shot, and the problem became clear. The CNC is finally back to carving the other, softer graphite like it was butter. I finally carved the mold I’ve been trying to make, but only did the rough pass. I think I like the terraced stepping… What do you think!?

In the meantime, please enjoy my latest work with graphite glitches…

Doctor Who’s Boneless Inspired By Fredini’s Zombie Army of 3D Print Fails!

The article How 3D printing glitches inspired these Doctor Who effects was recently brought to my attention. It (and several others like it on the web) describes how 3D printing “glitches” like my Zombie Army print fails were the inspiration for the BBC show’s villains, the “Boneless”. To my great surprise and delight, the example 3D print fail cited as influential on the Boneless was one of my earliest self portraits! As a Doctor Who fan going all the way back to middle school, it’s quite an honor to have been of service to the Doctor.


“Boneless” was the name the Doctor gave to creatures from a two-dimensional universe. They were able to reduce both lifeforms and other three-dimensional objects to two dimensions. After taking on three-dimensional forms themselves, they were also able to restore objects from two dimensions back to three.


This 3D-printed self portrait (I call it “Max Fredroom”) was one of my earliest print failures on my original Printrbot printer, and has been cited as influential on the visual style of Doctor Who’s Boneless.

Come Learn 3D Modeling, Scanning, Printing & Glass Casting with me at Pilchuck in June!

There are only a few weeks until my class TaDDDaa!, which covers a range of digital processes for glass casting at the Pilchuck Glass School in Stanwood, Washington, and there are still a couple of slots available! Session 2 runs May 30 to June 17. The course combines a digital track of 3D modeling, scanning, and 3D printing/CNC routing with a physical track of glass casting, using lost-PLA kiln casting and hot casting into CNC-carved graphite molds.

Apply today and join me in the woods for the time of your life!


This sequence shows the progression of a 3D scan being manipulated digitally, the 3D printed sculpture, and its final incarnation as a cast glass object.


This sequence shows a CNC routed mold that can be used for hot glass casting.

Putting the NefertitiHack Scan-Dal to Bed

This post is an apology to Nora Al-Badri and Jan Nikolai Nelles. I hope my questioning of your methods doesn’t land you in hot water. Last week, my post There’s Something Fishy About the Other Nefertiti blew the lid off a debate that had been generating a lot of chatter in online 3D scanning, printing, and art circles. My post debunked the possibility that your art project The Other Nefertiti was scanned using an Xbox Kinect sensor. Immediately after my posting, the internet seemingly exploded with commentary about how the “Nefertitihack” had to be a hoax. At the end of the day, I don’t think it matters whether you scanned the artifact with a Kinect or procured a file from other sources, because what you have created is a wonderful dialogue about the provenance of antiquities, ownership, and copying, and a strong argument for openness and sharing of data. What’s great is that this priceless antiquity is now within reach of anyone who cares to experience it, and people are interacting with Nefertiti in unique and personal ways they never have been able to before.

The repercussions began with Hacker News picking up my post, which produced a ton of traffic and some great comments. The Nefertiti Hack: Digital Repatriation or Theft? was the first press I’m aware of that began to seriously consider that things with the story didn’t add up. Hyperallergic’s Claire Voon then produced some of the most insightful commentary in Could the Nefertiti Scan Be a Hoax — and Does that Matter?, and it all culminated with the Times’ Charly Wilder’s follow-up, Nefertiti 3-D Scanning Project in Germany Raises Doubts (she was aghast that none of the three experts she had spoken to for her original article had picked up on the possibility that the data was too good for the claimed capture method).

We may never know the true origin of the file. I tend to think it is somehow derived from an official museum scan, perhaps copied during the creation of the museum’s 3D-printed edition. However, the real conclusion I’ve come to is that it doesn’t matter. What matters is something the internet has taught us all along: from WikiLeaks to Nefertiti, information wants to be free. Already, a quick Google search will turn up a range of Nefertiti derivative works, ranging from interactive art to Lego minifig and Pez heads, Voronoi (swiss cheese) versions, Nefertiti vases, and many more. What will be done in the future with her data, only time will tell.

Some Nefertiti-inspired works that are already making the rounds. The best is yet to come.


Perhaps the best argument for museums being open with their data is the talk below, just posted by Cosmo Wenman. Cosmo is an artist who has been actively scanning antiquities from museum collections for several years now. He has scanned a number of famous objects and casts made from objects, including the Venus de Milo and the Winged Victory of Samothrace. The talk likens these 3D scans to the 19th-century tradition of making plaster casts of classical sculptures, and makes a strong case by showing several examples of how people have used the data he shared. This is a position shared by many museums, including the Media Lab of the Metropolitan Museum of Art here in NYC, and I sincerely hope the trend will continue. The promise of the internet is that all of man’s knowledge will be at our fingertips. The Neues Museum would be wise to take note.

There’s Something Fishy About The Other Nefertiti

The New York Times and countless other news outlets picked up on a story that’s been making the rounds of the 3D printing community the last few weeks: Swiping a Priceless Antiquity … With a Scanner and a 3-D Printer tells the story of two artists, Nora Al-Badri and Jan Nikolai Nelles, who apparently succeeded in pulling off a massive digital art heist when they surreptitiously scanned Nefertiti’s bust in Berlin’s Neues Museum. It’s great art commentary on many levels: it speaks to issues around digital copies vs. originals and who is allowed to own those copies, and, going deeper, it questions the museum’s right to own a statue that rightfully should belong to Egypt. By releasing the files for The Other Nefertiti free online as a torrent, the artists have initiated a huge debate with many layers. The 3D printing community is not without its own debate about this work, either. The video the artists share, which shows them sneaking a Kinect scan from beneath Ms. Al-Badri’s scarf, raises a lot of questions as to whether the artists even scanned the artifact as they claim.

The video shows the two using an Xbox Kinect sensor to capture Nefertiti, and while the artists may well have performed the Kinect stunt, there is simply no way the scan being distributed was made with a Kinect. Simply put, the scan being distributed, which is made of more than 2 million triangles, is far too detailed to have been made with that hardware. Even if the device was “hacked,” as news reports state, this quality is not achievable with this method. Problems with the story include how the unit was powered (the Kinect requires a wall outlet), but even if they carried some kind of battery, Nefertiti is under glass, which also causes errors with this method of scanning. Other questions include where the connected laptop was hidden. Then there is the video itself, which shows Ms. Al-Badri repeatedly covering and uncovering the Kinect as she circles Nefertiti; a normal scan would require an uninterrupted line of sight to the statue while the scan is happening. To get coverage of the top of the headdress, the scanner would have to be held high above the statue, not at waist level as the video indicates. Multiple scans might have been made from various vantage points and meshed together later, but truthfully the scan is far too high fidelity for the way the Kinect works. The device projects a grid of infrared dots over the subject, measures the distance to each dot, and the computer software does the math to turn those measurements into a 3D model. It’s a fast way to generate a 3D model, but with accuracy that is at best within a few millimeters.
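For those curious about the numbers, here is a rough sketch, with assumed intrinsics rather than real calibration data, of how a Kinect-style depth frame becomes a point cloud, and why the result tops out at millimeter-level precision.

```python
# A rough sketch (not the artists' pipeline) of how a Kinect-style depth frame
# becomes 3D points. The intrinsics below are assumed, ballpark values for a
# 640x480 depth camera of that era, not real calibration data.
import numpy as np

FX = FY = 575.0          # assumed focal length in pixels
CX, CY = 320.0, 240.0    # assumed principal point (image center)


def depth_to_point_cloud(depth_mm: np.ndarray) -> np.ndarray:
    """Back-project a depth image (values in whole millimeters) to an Nx3 point cloud."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0     # meters
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]              # drop invalid (zero-depth) pixels


# Stand-in frame: a flat surface one meter away. Real sensor depth arrives already
# quantized to coarse steps, which is where the few-millimeter accuracy limit comes from.
frame = np.full((480, 640), 1000, dtype=np.uint16)
cloud = depth_to_point_cloud(frame)
print(cloud.shape)       # (307200, 3): about 0.3 million points per frame
```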


I have made over 10,000 Kinect scans. Here we see a typical “good” quality Kinect capture. Tolerances with this type of scanning are at best within a few millimeters of accuracy, nowhere near the sub-millimeter quality of Nefertiti’s scan.


Nefertiti’s scan is much more accurate than what is possible with a Kinect.

So if it wasn’t made with a Kinect, how was this scan created, and why lie about it? When confronted about how the scan was made on the 3D in Review podcast, Mr. Nelles was vague in his answers, claiming that he and Ms. Al-Badri knew nothing about the device and that some hacker types had set up the hardware for them. The “hackers” then took the data and created the model for them. When asked about the hackers’ technique, Mr. Nelles stated that they had left for New Zealand and could not be contacted.

One theory is that the scan was actually generated by photogrammetry, a technique in which images of the sculpture are captured from a variety of angles. The images are then fed into software such as Agisoft PhotoScan, which analyzes all the images for common points and generates a 3D model of the subject. Paul Docherty is a researcher who has used photogrammetry extensively to reconstruct historic artifacts and sites, including a model of Nefertiti’s bust built from imagery he gathered online. He catalogued the process in his article 3D Modelling the Bust of Queen Nefertiti, and also spoke on the 3D in Review podcast about his efforts. Mr. Docherty has since gone on to question the Nefertiti Hack scan in his article Nefertiti Hack – Questions regarding the 3D scan of the bust of Nefertiti, in which he agrees that there is no way this scan was captured with a Kinect. So it’s possible that the scan was made using a series of 45 to 120 high-resolution images covertly gathered with a cellphone, but if that’s the way it was done, why show the Kinect in the video?
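As a rough illustration of what that “common points” step looks like, here is a toy sketch using OpenCV’s SIFT detector on two hypothetical photos of the bust. This is not Mr. Docherty’s pipeline; real photogrammetry packages repeat this across every image pair and then triangulate the matches into a 3D model.

```python
# A toy illustration of photogrammetry's first step: finding the common points
# that two photos share. The filenames are placeholders; full tools like Agisoft
# PhotoScan repeat this across every image pair and triangulate the matches.
import cv2

img1 = cv2.imread("bust_view_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("bust_view_02.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()                       # scale-invariant keypoint detector
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors between the two views and keep only unambiguous matches
# (Lowe's ratio test); these become candidate 3D points after triangulation.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} common points shared by the two views")
```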


The last possibility, and the reigning theory, is that Ms. Al-Badri and Mr. Nelles’ elusive hacker partners are literally real hackers who stole a copy of the high-resolution scan from the museum’s servers. A high-resolution scan must exist, as a high-res 3D-printed replica is already available for sale online. Museum officials have dismissed the Other Nefertiti model as “of minor quality,” but that’s not what we are seeing in this highly detailed scan. Perhaps the file was obtained from someone involved in printing the reproduction, or it was a scan made of the reproduction? Indeed, the common belief in online 3D printing community chatter is that the Kinect “story” is a fabrication to hide the fact that the model is actually stolen data from a commercial high-quality scan. If the artists were behind a server hack, the legal ramifications for them are much more serious than those for scanning the object, which has few, if any, legal precedents.

What do you think? Will we ever know? Nefertiti’s data liberation has certainly sparked some controversy and I look forward to seeing what the community creates from it, as well as what truths come out as the story unfolds!