Doctor Who’s Boneless Inspired By Fredini’s Zombie Army of 3D Print Fails!

This article, How 3D printing glitches inspired these Doctor Who effects, was recently brought to my attention. The article (and several others like it on the web) talks about how 3D printing “glitches” like my Zombie Army print fails were the inspiration for the BBC show’s villains, the “Boneless”. To my great surprise and delight, the example 3D print fail being cited as influential on the Boneless was one of my earliest self portraits! As a Doctor Who fan going all the way back to middle school, it’s quite an honor to have been of service to the Doctor.


“Boneless” was the name the Doctor gave to creatures from a two-dimensional universe. They were able to reduce both lifeforms and other three-dimensional objects to two dimensions. After taking on three-dimensional forms themselves, they were also able to restore objects from two dimensions back to three.


This 3D printed self-portrait (I call it “Max Fredroom”) was one of my earliest print failures on my original Printrbot printer, and has been cited as influential on the visual style of Doctor Who’s Boneless.

Come Learn 3D Modeling, Scanning, Printing & Glass Casting with me at Pilchuck in June!

There are only a few weeks until my class TaDDDaa!, which covers a range of digital processes for glass casting at the Pilchuck Glass School in Stanwood, Washington, and there are still a couple of slots available! Session 2 runs May 30–June 17. The course combines a digital track (3D modeling, scanning, and 3D printing/CNC routing) with a physical track covering lost PLA kiln casting and hot casting into CNC-carved graphite molds.

Apply today and join me in the woods for the time of your life!


This sequence shows the progression of a 3D scan being manipulated digitally, the 3D printed sculpture, and its final incarnation as a cast glass object.


This sequence shows a CNC routed mold that can be used for hot glass casting.

There’s Something Fishy About The Other Nefertiti

The New York Times and countless other news outlets picked up on a story that’s been making the rounds of the 3D printing community over the last few weeks. Swiping a Priceless Antiquity … With a Scanner and a 3-D Printer tells the story of two artists, Nora Al-Badri and Jan Nikolai Nelles, who apparently pulled off a massive digital art heist when they surreptitiously scanned Nefertiti’s bust in Berlin’s Neues Museum. It’s great art commentary on many levels: it speaks to issues around digital copies vs. originals and who is allowed to own those copies, and, going deeper, it questions the museum’s right to own a statue that rightfully belongs to Egypt. By releasing the files for The Other Nefertiti free online as a torrent, the artists have initiated a huge, many-layered debate. The 3D printing community is not without its own debate about this work, either. The video the artists share, which shows them sneaking a Kinect scan from beneath Ms. Al-Badri’s scarf, raises a lot of questions as to whether the artists even scanned the artifact as they claim.

The video shows the two using a Microsoft Kinect, the Xbox motion-sensing camera, to capture Nefertiti, and while I have no doubt the artists may have performed the Kinect stunt, there is simply no way the scan being distributed was made with a Kinect. Simply put, the scan being distributed, which is made of more than 2 million triangles, is far too detailed to have been made with that hardware. Even if the device was “hacked,” as news reports state, this quality is not achievable with this method. Problems with the story include how the unit was powered (the Kinect requires a wall outlet), and even if they carried some kind of battery, Nefertiti’s bust is under glass, which also causes errors with this method of scanning. Another question is where the connected laptop was hidden. Then there is the fact that the video shows Ms. Al-Badri repeatedly covering and uncovering the Kinect as she circles Nefertiti; a normal scan requires an uninterrupted line of sight to the statue while the scan is happening. To get coverage of the top of the headdress, the scanner would have to be held high above the statue, not at waist level as the video indicates. Multiple scans might have been made from various vantage points and meshed together later, but truthfully the scan is far too high-fidelity for the way the Kinect works. The device sprays a grid of infrared points over the subject, measures the distance to each point, and the computer software does the math to turn that into a 3D model. It’s a fast way to generate a 3D model, but its accuracy is at best within a few millimeters.
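
To put numbers on that mismatch, here is a rough back-of-envelope comparison in Python. The Kinect’s depth-map resolution is well documented, but the noise figure, the bust’s scanned surface area, and the vertex count I infer from the triangle count are loose assumptions of mine, not measurements.

```python
# Back-of-envelope comparison; the figures below are rough assumptions, not measurements.
import math

# Kinect v1 depth stream: 640 x 480 samples per frame, with noise of a few millimeters.
kinect_samples_per_frame = 640 * 480        # ~307,000 depth points per frame
kinect_depth_noise_mm = 3.0                 # assumed typical noise at ~1 m range

# The distributed "Other Nefertiti" mesh is reported as >2 million triangles,
# which for a closed mesh is on the order of ~1 million vertices.
mesh_vertices = 2_000_000 / 2

# Assume very roughly 0.3 m^2 of scanned surface on the bust (assumption).
surface_area_m2 = 0.3
vertex_spacing_mm = math.sqrt(surface_area_m2 / mesh_vertices) * 1000

print(f"Kinect depth samples per frame: {kinect_samples_per_frame:,}")
print(f"Approx. vertex spacing of the released mesh: {vertex_spacing_mm:.2f} mm")
print(f"Typical Kinect depth noise: ~{kinect_depth_noise_mm:.0f} mm")
# The released mesh resolves detail several times finer than the sensor's own noise floor.
```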


I have made over 10,000 Kinect scans. Here we see what is typical of a “good” quality Kinect image capture. Tolerances with this type of scanning are at best within a few millimeters of accuracy, nowhere near the sub-millimeter quality of Nefertiti’s scan.


Nefertiti’s scan is much more accurate than what is possible with a Kinect.

So if it wasn’t made with a Kinect, how was this scan created, and why lie about it? When confronted about how the scan was made on the 3D in Review Podcast, Mr. Nelles was vague in his answers, claiming that he and Ms. Al-Badri knew nothing about the device and that some hacker types had set up the hardware for them. The “hackers” then took the data and created the model for them. When asked about the hackers’ technique, Mr. Nelles stated that they had left for New Zealand and could not be contacted.

One theory is that the scan was actually generated by photogrammetry, a technique in which the subject is photographed from a variety of angles. The images are then fed into software such as Agisoft Photoscan, which analyzes all the images for common points and generates a 3D model of the subject. Paul Docherty is a researcher who has used photogrammetry extensively to reconstruct historic artifacts and sites, including a model of Nefertiti’s bust built from imagery he gathered online. He catalogued the process in his article 3D Modelling the Bust of Queen Nefertiti, and also spoke on the 3D in Review podcast about his efforts. Mr. Docherty has since gone on to question the Nefertiti Hack scan in his article Nefertiti Hack – Questions regarding the 3D scan of the bust of Nefertiti, in which he agrees that there is no way this scan was captured with a Kinect. So it’s possible that the scan was made from a series of 45–120 high-res images covertly gathered with a cellphone, but if that’s the way it was done, why show the Kinect in the video?
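
For anyone curious what “analyzing the images for common points” looks like in practice, here is a minimal sketch of just that step using OpenCV in Python. The file names are placeholders of mine, and a dedicated package like Photoscan repeats this across dozens or hundreds of photos before solving for camera positions and a dense model.

```python
# Minimal sketch of the "common points" step in photogrammetry, using OpenCV
# (requires a recent opencv-python). The file names are placeholders.
import cv2

img_a = cv2.imread("photo_a.jpg", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("photo_b.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()                      # detect scale-invariant keypoints
kp_a, desc_a = sift.detectAndCompute(img_a, None)
kp_b, desc_b = sift.detectAndCompute(img_b, None)

# Match descriptors between the two views and keep only the unambiguous matches.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(desc_a, desc_b, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe's ratio test

print(f"{len(good)} common points found between the two photos")
# Photogrammetry software repeats this for every overlapping image pair, solves
# for where each camera was, and triangulates the matches into a 3D model.
```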


The last possibility, and the reigning theory, is that Ms. Al-Badri and Mr. Nelles’ elusive hacker partners are literally real hackers who stole a copy of the high-resolution scan from the museum’s servers. A high-resolution scan must exist, as a high-res 3D printed replica is already available for sale online. Museum officials have dismissed the Other Nefertiti model as “of minor quality,” but that’s not what we are seeing in this highly detailed scan. Perhaps the file was obtained from someone involved in printing the reproduction, or perhaps it is a scan of the reproduction itself. Indeed, the common belief in online 3D printing community chatter is that the Kinect “story” is a fabrication to hide the fact that the model is actually stolen data from a commercial high-quality scan. If the artists were behind a server hack, the legal ramifications for them are much more serious than those for scanning the object, which has few, if any, legal precedents.

What do you think? Will we ever know? Nefertiti’s data liberation has certainly sparked some controversy and I look forward to seeing what the community creates from it, as well as what truths come out as the story unfolds!

 


Explorations of Processes for Digitally Created Glass Castings

I spent much of 2015 on a year-long flexible fellowship at Wheaton Arts’ Creative Glass Center of America developing ways to cast glass from computer-generated sculptural forms, and I wanted to take some time to share what I learned. I worked extensively with glass many years ago but now create most of my art with 3D scanning and printing. The Wheaton Arts fellowship was a unique opportunity to bring these two practices back together.


Wheaton Arts is home to the Creative Glass Center of America in Millville, NJ, where I had a fellowship this past year to conduct experiments and create a workflow for casting glass from computer designed objects.

My work in the last few years has used a combination of 3D modeling techniques. I usually begin with 3D scanning, primarily structured light scanning with PrimeSense/Kinect-style devices and occasionally photogrammetry for non-human subjects. Other, non-organic forms are modeled directly in the computer using my software of choice, ZBrush. Next I digitally manipulate and sculpt the 3D scans in the computer. Finally the work is output as a 3D print. My investigations this year took these techniques further, so that these digital sculpts were then realized as cast glass forms. I tried a few variations of lost PLA casting, as well as CNC milling graphite to make reusable molds. Enjoy.
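
As a small example of the digital side of that workflow, here is the kind of cleanup pass a raw scan typically needs before it can be printed, sketched with the open-source trimesh library in Python. The file name and target size are hypothetical, and a real scan usually needs far more hole-filling and sculpting than this.

```python
# A minimal scan-prep sketch using the trimesh library; file names and the
# target size are placeholders for illustration.
import trimesh

mesh = trimesh.load("scan.stl")

# A printable (and castable) model needs to be a closed, watertight solid.
print("watertight before repair:", mesh.is_watertight)

# Patch small holes and drop leftover junk from the scanning process.
trimesh.repair.fill_holes(mesh)
mesh.remove_unreferenced_vertices()

# Scale the model so its largest dimension is 150 mm, then export for slicing.
target_size_mm = 150.0
mesh.apply_scale(target_size_mm / mesh.extents.max())
mesh.export("scan_print_ready.stl")
```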

Lost PLA Kiln Casting

The first technique I chose to explore is what people are calling “Lost PLA”, basically an evolution of the traditional lost wax kiln casting technique. Starting with a 3D printed positive of the form I want to cast in glass, I created a plaster/silica mold around my 3D print.

The actual recipe for the mold by weight was:

  • 16 parts water
  • 6 parts Hydroperm
  • 6 parts Plaster
  • 6 parts Silica (or olivine sand)
  • 1 cup 3/4″ fiberglass strand

This could be done with just a 50/50 plaster-silica mix, but as I understand it, the Hydroperm foams and creates air pockets in the molds to make them lighter. The fiberglass strand strengthens the mold and helps wick out moisture so the molds dry more efficiently.
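
Since the recipe is given in parts by weight, scaling it to a particular batch is just a little arithmetic; here is a quick calculator sketched in Python. The 34 lb example batch is arbitrary, and the fiberglass strand is added by volume rather than scaled by weight.

```python
# Batch calculator for the mold mix above (parts are by weight; the fiberglass
# strand is added by volume, roughly one cup for a batch of this size).
RECIPE_PARTS = {
    "water": 16,
    "Hydroperm": 6,
    "plaster": 6,
    "silica (or olivine sand)": 6,
}

def batch(total_weight_lbs):
    """Scale the parts-by-weight recipe to a target total batch weight."""
    total_parts = sum(RECIPE_PARTS.values())
    return {name: total_weight_lbs * parts / total_parts
            for name, parts in RECIPE_PARTS.items()}

# Example: a 34 lb batch works out to 16 lbs of water and 6 lbs of each dry ingredient.
for name, lbs in batch(34).items():
    print(f"{name:>26}: {lbs:.1f} lbs")
```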

I began by plugging any holes in the surface of the 3D prints with microcrystalline wax and waxing the prints down to a table. I then cut strips of tar paper and hot-glued them down to form a wall around each print, leaving room for about 3″ of mold thickness. I mixed a small initial batch with no fiberglass strand to use as a splash coat over the object, then mixed subsequent buckets of mix to fill the molds completely. After the molds were filled, I let them set before tipping them on their sides to dry (fans helped with this).


3D Print affixed to the table with wax and surrounded by a cylinder of tar paper


3D prints and tar paper prepped for casting molds


Splash coat on the 3D print

 


Mold ingredients being dry mixed before the water is added


Completed molds drying

After a minimum of one day of drying (and preferably more like a week), the molds were ready to be loaded into the ovens. The PLA was still inside and needed to be melted out. With lost wax casting, the general practice is to steam the wax out of a mold. I tried this with the 3D printed PLA and barely got it to move at all. Steaming out PLA is not an option, so it has to be burned out in an oven.

Even though PLA is a biodegradable plastic made from corn starch, the burnout is smoky and not good to be around, so it had to be timed to happen overnight when the studio was empty. I would begin by soaking the oven at 300˚ for about three hours and then pushing it upwards at about 100˚/hour. At about 450˚ I would go in (wearing gloves, glasses and a respirator) and use pliers to pull out some big chunks of plastic as it started melting, being careful not to damage the mold in the process. At about 700˚, I would go in with a stainless steel turkey baster and suck out as much molten plastic as possible. The oven would then go to 1000˚ for an hour to fully burn out. Burnout in an oven smells pretty nasty at its peak, so I can’t really recommend it as a best practice. However, I just became aware of Moldlay, a 3D printing filament designed for lost wax casting. It’s expensive, but I would like to check it out.
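
Written out as data, that burnout cycle looks roughly like the schedule below. The target temperatures are the ones from my notes above; the ramp rate and hold times between the manual interventions are rough placeholders, and every kiln and mold size behaves a little differently.

```python
# The burnout cycle above expressed as data (temperatures in °F). Hold times
# are rough placeholders; the initial ramp from room temperature to 300°F is
# not counted here.
BURNOUT_SCHEDULE = [
    # (target_temp_F, ramp_F_per_hr, hold_hrs, note)
    (300,  None, 3.0, "soak to drive off moisture"),
    (450,  100,  0.0, "pull out big chunks of softened PLA with pliers"),
    (700,  100,  0.0, "suck out molten plastic with a steel turkey baster"),
    (1000, 100,  1.0, "final burnout hold"),
]

def total_hours(schedule, start_temp_f=300):
    hours, temp = 0.0, start_temp_f
    for target, ramp, hold, _note in schedule:
        if ramp:
            hours += (target - temp) / ramp   # time spent ramping
        hours += hold                         # time spent holding
        temp = target
    return hours

print(f"Approximate burnout cycle: {total_hours(BURNOUT_SCHEDULE):.1f} hours")
```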


Mold mid burnout


Molds at 1000˚F after burnout

From there I tried two basic approaches to casting into the plaster molds. The first technique involved putting molten glass from a furnace directly into the molds. The second involved cooling the molds back down, packing them with chunks of crystal glass, then firing them to melting point. Each technique has its strengths and weaknesses.

Lost PLA Kiln Casting Technique 1- “Hot Glass Lacrosse Casting”

Since buying crystal to kiln cast with is very expensive, I was trying to be more cost-effective by casting with the readily available furnace glass. Basically, after the mold was burned out I would soak it at 1000˚ for several hours to drive off any chemical water, and then we would ladle glass directly into the mold. I found that even after I soaked the molds 10–12 hours at 1000˚, the chemical water in the plaster would still cause the glass to bubble up as we poured it in. We then resorted to a technique we called Lacrosse Casting. I would gather a ladle of glass, then dump the ladle into a second ladle someone else was holding. They would rock the ladle side to side so that the molten glass skinned up a little on the outside. They would then dump that back into my ladle, and I would go to the oven and gingerly drop this “hot tamale” of glass into the mold. The center of the glass was still quite hot, but the skinned-up exterior was less likely to bubble. Unfortunately the molds are quite fragile, and this can cause damage if the mold has a lot of delicate details. After the mold was filled, the oven would be sent up to about 1500–1600˚ to make the glass flatten out and flow into the mold. As soon as the glass flattened, the oven was crashed back down to under 1000˚ and an annealing cycle began.


Passing off ladles full of molten glass in order to cool the exterior of the glass before dropping it in the fragile molds


The 2300˚ “lacrosse” pass


Dropping glass into the mold


Crashing the oven after it reaches 1600˚F and the glass has flattened out

Issues with this technique

  • This process takes a lot of time! I was casting forms that took 2+ days to print, then there was mold making, casting and annealing.
  • The plaster molds are extremely fragile and can easily be damaged by the glass as it is dropped in the mold. This results in imperfect castings that often have bits of plaster encased in the glass.
  • If the ladles are not very clean, the glass will often have veiling from the ladle surface. Bubbles are also often introduced, resulting in very bubbly glass. In my case I liked the underwater look this gave.
  • Devitrification is a crystallizing of the glass that happens at approximately 1200˚F, making the surface of the glass fog up. Soda lime glass (Spruce Pine batch) is particularly vulnerable to this. In my case, if there was not enough radiant heat above the top of the mold and the oven took too long to get hot enough for the glass to level out, the surface would fog up.

Sample Castings (Unfinished work)


Death (work in progress) 14″x 14″x 6″


Creation (work in progress) 14″ x 14″ x 5″


Untitled work in progress 14″ x 14″ x 4″


The Gates of Heaven 24″x 19″x 5″

 

Lost PLA Kiln Casting Technique 2- Kiln Casting

The issues with molds being damaged by hot glass and with devitrification led me to acquire some crystal to kiln cast with. Casting crystal is expensive and usually formulated not to devitrify, so in general this technique delivers more optically pure castings. I tried two glass formulas for kiln casting: Uroboros Glass’ System 96 and Schott optical crystal.

After the mold was burned out, I would let the oven slowly return to room temperature so I could carefully vacuum it out and pack it with chunks of glass. (My glass came in large tiles, so I cleaned the surfaces with alcohol and then used a torch to shatter them.)

Bang! #glass @wheatonarts


The small chunks could then be loaded into the mold, and the oven slowly brought back up to about 1550˚ until they melted in completely and the worst of the bubbles came to the surface. At that point the oven was crashed back down to under 1000˚ and the glass was annealed (slowly brought to room temperature).
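
As a sketch, that kiln-casting cycle reads roughly like the firing schedule below. The 1550˚ fuse and the crash to under 1000˚ come from the description above; the ramp rates, hold times, and the 950˚F annealing soak are illustrative placeholders that depend on the glass formula and the mass of the casting.

```python
# Illustrative kiln-casting schedule (temperatures in °F). Only the 1550°F fuse
# and the crash below 1000°F come from the text; everything else is a placeholder.
FIRING_SCHEDULE = [
    # (ramp_F_per_hr, target_temp_F, hold_hrs, note)
    (150,  1225, 1.0, "slow climb so the packed chunks heat evenly"),
    (300,  1550, 4.0, "full fuse: glass flows into the mold, bubbles rise"),
    (9999, 950,  2.0, "crash the kiln down and soak at the annealing point"),
    (10,   70,   0.0, "very slow cool to room temperature"),
]

for ramp, target, hold, note in FIRING_SCHEDULE:
    rate = "AFAP" if ramp >= 9999 else f"{ramp}°F/hr"   # AFAP = as fast as possible
    print(f"{rate:>9} -> {target:>4}°F, hold {hold:.1f} hr  ({note})")
```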


Loading chunks of crystal into the mold


Close up of mold and crystal chunks. I like how the mold picked up the layer lines from the 3D print.


The oven is ready to be heated back up


At about 1550-1600˚F, the glass is molten and flowing into the mold


It’s a delicate balance deciding how long to keep the mold hot. You want to get the bubbles to rise to the surface, but you also want to stop before the glass begins to devitrify and fog on the surface.

  • This technique definitely yielded the best casting results for Lost PLA
  • More expensive, both in the time-consuming process and oven time, and most significantly in the cost of the casting crystal
  • Devitrification can still be an issue, depending upon the glass used
  • The final casting still requires extensive grinding, polishing and finishing work

Angel/Mermaid (work in progress) 14″ x 14″ x 6″


Funny Face (work in progress) 14″ x 14″x 6″

Hot Glass Casting into CNC milled graphite molds

I quickly realized that the lost PLA technique was time-consuming and disconnected from all the excitement and spontaneity I associate with hot glass work. Lost PLA castings also required extensive work divesting them from the mold, then grinding and polishing. I had long wanted to experiment with CNC carving as opposed to 3D printing, so I set out to mill graphite into reusable molds as a more cost-effective approach to casting glass.


CNC milled graphite molds for glass casting; rough pass carving with 1/4″ endmill, finished mold cut with 1/8″ ball mill, molten glass in the mold and annealed, cooled glass casting

I decided to purchase the Inventables X-Carve because it is an open-source CNC machine, and I wanted to support Inventables’ great work in making CNC software more user friendly. I opted for the 1000mm version with the heavy-duty NEMA 23 motors and the DeWalt 611 spindle, which has enough power to mill even aluminum. I had a lot of trepidation about building a kit, as some of the 3D printer kits I had built in the past were not well documented, but Inventables’ documentation was excellent and the machine worked pretty well right off the bat. A few support calls and posts on the message boards got me through the few small hiccoughs I did encounter.

However, I was concerned about milling graphite because the dust is electrically conductive. If that dust got everywhere as I was using the machine, not only would it make a mess, it could also fry the X-Carve’s Arduino controller and even the laptop driving the setup. I had no choice but to rig up a robust dust collection system. I ended up buying a dust shoe from KentCNC to mount around the spindle (yes, I could have made my own, but I was running out of time at Wheaton by then). This attached to a Dust Deputy cyclonic dust collector and a shop vacuum. The result is a powerful dust collection system that captures nearly all the graphite coming off the spindle as it cuts, and 95% of it ends up in my new favorite tool, the Dust Deputy.


The newly assembled X-Carve, still in need of some wire management


Unlike 3D printing, CNC milling involves the machine making multiple passes. Shown here is the rough pass, done with a 1/4″ endmill.


The finished mold, cut with a 1/8″ ball mill


The big test… Trying out the mold for the first time!


Torching the glass as it sets up in the mold


It looks amazing!


The final result: a cast glass tile from a mold that will stand up to hundreds of castings!

Coming out of the last year of experiments, I’m most excited about this process, as it has a lot of potential for small-run, computer-designed glass objects and custom tiles for architectural use. There are some design restrictions in that the molds cannot have undercuts and the X-Carve can only cut about 2 1/2″ at the deepest, but because these molds can be generated quickly and comparatively cheaply and used to create cost-effective editions, I think this approach has a lot of promise. I look forward to continuing to experiment with these techniques as I share them with my class this June at the Pilchuck Glass School.
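
To make those restrictions concrete, here is a rough feasibility check sketched in Python. It assumes the mold design is expressed as a heightmap (a grid of cut depths), which by construction has no undercuts; the stepover and feed rate are illustrative numbers, not settings I have tested.

```python
# Rough feasibility check for a graphite mold design. The design is assumed to
# be a heightmap of cut depths in inches; stepover and feed rate are illustrative.
import numpy as np

MAX_DEPTH_IN = 2.5   # deepest cut this X-Carve setup can reach

def check_and_estimate(depth_map_in, width_in, height_in,
                       stepover_in=0.02, feed_in_per_min=40.0):
    """Check the depth limit and estimate finishing-pass time for a raster toolpath."""
    deepest = float(depth_map_in.max())
    if deepest > MAX_DEPTH_IN:
        raise ValueError(f"design is {deepest:.2f} in deep; the limit is {MAX_DEPTH_IN} in")

    # A raster finishing pass makes one traverse across the width per stepover row.
    rows = int(np.ceil(height_in / stepover_in))
    minutes = rows * width_in / feed_in_per_min
    return deepest, minutes

# Example: an 8" x 8" tile mold, up to 1.25" deep (random demo data).
demo_depths = np.random.uniform(0.0, 1.25, size=(200, 200))
deepest, minutes = check_and_estimate(demo_depths, width_in=8.0, height_in=8.0)
print(f"deepest cut: {deepest:.2f} in, estimated finishing pass: {minutes / 60:.1f} hours")
```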

 

TaDDDaa! A Class in Digital Processes for Glass Casting at Pilchuck Glass School

I’m pleased to announce that I will be teaching a class at the Pilchuck Glass School in June of 2016. The class will cover a lot of the digital-to-physical techniques that I have been working with over the last few years, particularly this past year with my fellowship at Wheaton Arts’ Creative Glass Center of America (stay tuned, I will be publishing my findings soon).

The intensive three-week class will take place in June of 2016. Pilchuck’s session 2 this year has a theme of Play, and we certainly will be playing with ways of bringing the 21st century to glassmaking’s 19th-century traditions.

TADDDAA! 
May 31- June 17
3-D Modeling & Printing, Lost PLA, CNC, Kilncasting, Hot Casting

Digital sculpting and 3-D printing tools allow artists to visualize prototypes, manipulate scale, and replicate with precision. This course will introduce an assortment of tools including Zbrush (an organic sculpting software), 3-D printers, and 3-D scanning methods of photogrammetry and structured light. Students will learn to scan, manipulate, and print objects and ultimately kilncast and hot cast them in glass. This class is for glass artists who wish to explore digital fabrication and 3-D artists who wish to explore glass.
http://pilchuck.com/summer_program/courses/session2.aspx

The class will cover:

  • 3D Modeling with Zbrush
  • 3D Scanning with structured light and photogrammetry
  • 3D Printing
  • Moldmaking
  • Glass kiln casting with lost PLA
  • CNC milling of graphite molds for hot glass casting

It’s going to be an awesome class and I’m looking for two teaching assistants. Applications for TAs are due February 3. I would like to find one TA who is well versed in 3D modeling and 3D printing (it’s a plus if you own or have built a 3D printer), and another who is familiar with glass kiln casting. Please help spread the word!

Scan-A-Rama 3D Portrait Studio 2015 Year in Review

Before the year ends I wanted to write about some of the events I was able to be a part of this year with the Scan-A-Rama 3D Portrait Studio. I had a spectacular year of making 3D portraits for thousands of people at a slew of events that included talks as well as scanning events at the Innovation Loft, the Midwest RepRap Festival, the Newark, New Jersey, Westport, Connecticut, Bay Area, and New York Maker Faires, PepsiCo’s Executive World Summit, and Theatre Bizarre. I also spent the year on an artist fellowship at Wheaton Arts’ Creative Glass Center of America, in which I innovated and explored methods of glass casting from digitally designed objects. Stay tuned for more on this; I will be posting the findings from this fellowship in January (I am just wrapping up there now). Finally, I went out to Seattle and set up a bot lab at the Pilchuck Glass School, where I will be teaching a class, TaDDDaa!, in June of 2016. The class will cover techniques of glass casting from a digital workflow that includes 3D sculpting, scanning, printing, and CNC carving. If this list isn’t impressive enough for you, I also had two amazing corporate gigs in November that I’ve been meaning to write up: the first at Google for their internal user experience conference, Google UXU, and the second for Pfizer at their Excelerate innovation conference!


Google’s UXU is an internal conference where the company’s entire UX ladder meets up to compare notes on best practices, hone skills and get inspired to create some of the world’s best digital products. I was honored to be invited out to the Googleplex to deliver the conference’s opening keynote address, expanded from my Bay Area Maker Faire talk about the history of technology as entertainment. It included my work on recreating Luna Park with 3D printing and ended with some conjectures about 3D printing and what will happen when hot rod culture gets hold of self-driving cars. Was the talk well received? Here’s what one attendee said:

Right to left: Speaker Corey Pressman, who gave a great talk about poetry for robots, Speaker Stella Grizont, who talked about the science of happiness, conference organizer Thea Kluge/Carter, and myself

The second day of the conference I ran the Scan-A-Rama 3D Portrait Studio at their mixer and created a number of great portraits for people

November was busy! I had barely returned from the Googleplex when Pfizer’s Excelerate conference began and I spent two days making 3D portraits of some of the movers and shakers of one of the world’s largest pharmaceutical companies. I even made a portrait of the CEO, Ian Read!

Here I am with Pfizer CEO Ian Read!

Ian Read’s raw scan

The cleaned up scan ready for printing

The final print…in Pfizer Blue!

2015 was an amazing year for me. I can’t say I got rich, but for my first year out on my own, doing what I love without working for someone else, I think I did pretty well for myself. Just writing it up now, I realize that Fredini Enterprises has done some great work… and there’s a lot more where that came from! I have a lot of exciting things in the pipeline for 2016: new products, artwork, inventions and more, so stay tuned.

Happy New Year Everyone!

3D Scanning at the Brooklyn Museum

I was recently at the Brooklyn Museum for the opening of the exhibit Coney Island: Visions of an American Dreamland, 1861–2008. It’s a great exhibit and I encourage everyone to go check it out! While there, I took the time to photograph several objects in order to generate 3D models of a few Coney Island artifacts, as well as some beautiful architectural details.

This process of photogrammetry, or “physical photography” as I have come to call it, involves photographing an object many times from all angles, taking care to ensure that each image is in full focus. Once the object is photographed, software analyzes the images to find the same points across multiple shots and works out where in space each camera was. From there, a point cloud and 3D mesh can be generated. It’s a laborious process, but it’s a very accurate way of generating 3D models of still objects like sculpture.
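
For the curious, here is a minimal sketch in Python with OpenCV of that last step: once the camera positions are known, matched image points can be triangulated into 3D. The two camera matrices and the handful of matched points below are made-up placeholders; real photogrammetry software recovers them automatically from hundreds of feature matches per photo.

```python
# Minimal triangulation sketch with OpenCV; the cameras and points are made up.
import numpy as np
import cv2

# 3x4 projection matrices for two camera positions (intrinsics * [R|t]).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[1.0], [0.0], [0.0]])])  # shifted sideways

# The same two surface points as seen by each camera (2 x N normalized pixel coords).
pts_cam1 = np.array([[0.10, 0.25],   # x coordinates
                     [0.20, 0.40]])  # y coordinates
pts_cam2 = np.array([[0.30, 0.45],
                     [0.20, 0.40]])

# Triangulate to homogeneous 3D coordinates, then divide out the scale factor.
points_h = cv2.triangulatePoints(P1, P2, pts_cam1, pts_cam2)
points_3d = (points_h[:3] / points_h[3]).T

print(points_3d)   # one (x, y, z) row per matched point: the start of a point cloud
```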

Here are the processed scans:


Spook-A-Rama cyclops head from Deno’s Wonder Wheel Park (Courtesy of the Coney Island History Project)


Pegasus Statues from the Coney Island pumping station. (read about these here)


Bacchus Keystone from the Brooklyn Museum Sculpture Garden. This scan came out amazing, with incredible detail to it!


Another Great Keystone!


Telamon (Male Caryatid) #1


Telamon #2


Architectural Detail from the Brooklyn Museum Sculpture Gardens