15 Feb 2019

Reality Virtually Hackathon @ MIT Media Lab

My colleague Lanh Hong and I spent the MLK holiday weekend at the Reality Virtually Hackathon, an event run in conjunction with the MIT Media Lab. The event is unique in that it is "purpose-built" to generate new and innovative solutions using XR technology, and it brought students and professionals together to form over 100 teams.

The Forge team began sponsoring this event in 2016, and this year we offered a $2,500 prize to the team that could show the best use of the Forge platform. Several teams worked with Autodesk technology, and five in particular showed good results using Forge. Because this event focuses on AR/VR/XR technology, it was a good place to raise awareness of the Forge AR/VR Toolkit, which is currently in beta.

The first day was a full day of workshops designed to introduce the various technologies that could be used during the hackathon. That day I presented a one-hour workshop introducing the Forge services and the resources for getting started quickly, and on Saturday I also gave a separate community talk about Forge that was open to the public.
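
If you want to follow along with the services mentioned below, the first step they all share is a 2-legged OAuth token. Here is a minimal sketch, assuming the v1 authentication endpoint that is current as of this writing; the client ID and secret are placeholders for your own app's credentials from the Forge developer portal.

```typescript
// Minimal sketch: fetch a 2-legged OAuth token from the Forge
// authentication service. FORGE_CLIENT_ID / FORGE_CLIENT_SECRET are
// placeholders for the credentials of your app on the developer portal.
async function getAccessToken(): Promise<string> {
  const response = await fetch(
    "https://developer.api.autodesk.com/authentication/v1/authenticate",
    {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        client_id: process.env.FORGE_CLIENT_ID!,
        client_secret: process.env.FORGE_CLIENT_SECRET!,
        grant_type: "client_credentials",
        // Scopes cover the data and translation calls sketched below.
        scope: "data:read data:write bucket:create",
      }).toString(),
    }
  );
  const { access_token } = await response.json();
  return access_token;
}
```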

Overall we saw two teams using Model Derivative and the Forge Viewer: one used the AR/VR Toolkit with Unity to meet the XR requirement, and the other used the Forge Viewer WebVR extension. Three more teams used the Forge Reality Capture API to create custom assets. While helping the Reality Capture teams get started, we quickly realized that the quick start mainly shows a drone capture and says little about the best technique for capturing with a hand-held camera. We directed those teams to some Photo ReCap help as well as to a great blog post by Denis Grigor, and Lanh helped them understand how to upload the files from their photo captures (sketched below).
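
To make that upload step concrete, here is a hedged sketch of the Reality Capture flow the teams followed, assuming the photo-to-3d v1 endpoints: create a photoscene, attach the photos, then kick off processing. The scene name and photo URLs are placeholders, and `getAccessToken` is the token helper sketched earlier.

```typescript
// Hedged sketch of the Reality Capture (photo-to-3d v1) flow:
// create a photoscene, attach photos, then start processing.
const BASE = "https://developer.api.autodesk.com/photo-to-3d/v1";

async function captureAsset(token: string, photoUrls: string[]) {
  const auth = { Authorization: `Bearer ${token}` };
  const form = { ...auth, "Content-Type": "application/x-www-form-urlencoded" };

  // 1. Create an empty photoscene. "object" suits hand-held capture of a
  //    single subject; "aerial" is the drone case the quick start shows.
  const sceneRes = await fetch(`${BASE}/photoscene`, {
    method: "POST",
    headers: form,
    body: new URLSearchParams({
      scenename: "hackathon-asset", // placeholder name
      format: "obj",
      scenetype: "object",
    }).toString(),
  });
  const { Photoscene } = await sceneRes.json();
  const sceneId: string = Photoscene.photosceneid;

  // 2. Attach the photos as file[0], file[1], ... form fields
  //    (remote URLs here; multipart uploads also work).
  const files = new URLSearchParams({ photosceneid: sceneId, type: "image" });
  photoUrls.forEach((url, i) => files.append(`file[${i}]`, url));
  await fetch(`${BASE}/file`, { method: "POST", headers: form, body: files.toString() });

  // 3. Start photogrammetry; poll GET /photoscene/{id}/progress afterwards.
  await fetch(`${BASE}/photoscene/${sceneId}`, { method: "POST", headers: auth });
  return sceneId;
}
```

By comparison, the WebVR team's XR step was much lighter: once a model is in the viewer, enabling VR is a single `viewer.loadExtension("Autodesk.Viewing.WebVR")` call on an initialized viewer instance.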


The winner:

The team that ultimately won our prize is called Sound Space. Their idea centered on acoustic simulation and visualization: using VR architectural design to understand how acoustics influence a design. You can see their demo and code samples here. Their Forge solution used Revit designs with custom parameters assigned to interior surfaces. The Revit model was then translated with the Model Derivative service, and the results were "previewed" using the Forge Viewer. Once the design was ready for the VR experience, the AR/VR Toolkit was used to bring the model into Unity. The metadata from this process included the custom parameters used to analyze the wall surfaces, and in VR an algorithm visualized the sound qualities of the materials.
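
For the curious, here is a sketch of what that translation step looks like against the Model Derivative v2 job endpoint. `objectUrn` is a placeholder for the base64-encoded URN of the uploaded .rvt file, and `getAccessToken` is the helper from earlier.

```typescript
// Sketch: post a Model Derivative translation job so the Revit design can
// be previewed in the Forge Viewer. `objectUrn` is the base64-encoded URN
// of the uploaded .rvt file (a placeholder here).
async function translateRevitModel(token: string, objectUrn: string) {
  const response = await fetch(
    "https://developer.api.autodesk.com/modelderivative/v2/designdata/job",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        input: { urn: objectUrn },
        // SVF is the format the Forge Viewer loads; the custom acoustic
        // parameters on interior surfaces come through as properties.
        output: { formats: [{ type: "svf", views: ["2d", "3d"] }] },
      }),
    }
  );
  return response.json();
}
```

Once the job finishes, the Forge Viewer can load the resulting SVF derivative by that same URN, which is how the team previewed the design before moving it into Unity.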


Some other cool observations:

The Reality Virtually Hackathon had over 1,600 applications, with only 440 of those accepted to attend. It was a nice, diverse crowd: about 100 teams from 35 countries, with 40% women. Not all were developers, either: about 20% of attendees were designers. All were introduced to Forge through the opening ceremony. Participants were encouraged to come with ideas and be prepared to collaborate, making new friends, colleagues, and teams along the way.

It was also cool to find Autodesk employee Viveka Devadas, a product support specialist for Revit, on a team with an interesting project: a first-person Cambodian story set during the Vietnam War.

The other teams I mentioned above that worked with the Forge APIs were:

All winners from the hackathon are listed here.

I also want to thank Sanjana Chand for helping to get this blog post authored. :-)
