This project's funding goal was not reached on April 5, 2013.
At Augmented Mountain we have been developing Augmented Reality (AR) apps for almost two years. We do this using our powerful and unique programming language, Palimpsest. The name pays homage to the Archimedes Palimpsest, a manuscript containing hundreds of pages of mathematics and theories that explain and question reality. Our Palimpsest lets us build dynamic, interactive, computer-generated environments that transform real-world geographies. While developing AR apps at Augmented Mountain we have been improving Palimpsest every step of the way. AR is a new frontier in digital media, and with your help Palimpsest can lead the way.
We are pleased to be kicking off our campaign at the Scope Art Fair in NYC. Augmented Mountain has partnered with Scope to develop an ongoing app called Outer Spaces. Outer Spaces builds augmented galleries for our clients to showcase whatever they like. The Outer Spaces app has built galleries for Red Bull, VH1, Fiat, and a wide range of art organizations and museums.
Right now we are at Scope in NYC, held at the huge Post Office in Midtown on 8th Ave between 32nd and 33rd. If you are in New York, come to the fair and visit our Outer Spaces booth. Let's talk about the future of AR and enjoy tons of art. We wanted to synchronize our fundraising with Scope because they were the first big contributor to our open source project. Over the last year they have helped with R&D by giving us office space in Brooklyn and booth space at the fairs in Miami and New York, and by introducing us to a hundred interesting people who are building bridges between programming and art.
Open sourcing Palimpsest will revolutionize the world of AR, but we need your help to get it out there. There are a few critical steps we need to take to ensure you get the best AR toolkit available. These will require all of our time, another C++ and Lua programmer, a web graphic designer, and a lot of lunches for our interns and volunteers.
Palimpsest is a unique AR language based on Lua that makes creating AR worlds fun, quick, and easy. With Palimpsest you can create complex, dynamic, interactive, immersive, and user-reactive experiences from start to finish in a matter of minutes to hours. This may take days, weeks, or even months with the conventional roll-your-own methods that are currently out there.
Step One – Take thousands of lines of Palimpsest code and organize them into a central trunk.
Step Two – Finish coding of the core features for Palimpsest 1.0. These features are listed below. Many of these are already finished, but there are some that still need quite a lot of work and testing.
Step Three – Test, test, and more testing of Palimpsest 1.0 so you won't have to deal with bugs when it is released.
Step Four – If we meet our stretch goals, we will be packaging Palimpsest into a user friendly interface. This will be great for those who aren't programmers. Think of Adobe Creative Suite with a toolkit that allows you to make augmented reality apps. The more money we raise the friendlier version 1.0 will be.
Upon release, Palimpsest 1.0 will offer:
- Organic User Interface options.
This is a basic set of View management objects that allow you to show images, text, and other 2D graphical interface objects as an overlay to the 3D immersive view. For the basic goal, these will be simple interface objects. If we meet our $100,000 stretch goal, they will be updated to be more complete, responsive, and have a look and feel that is native to any device they may eventually run on.
- Visual display.
Palimpsest can project 3D models over a camera display so that they appear to be located at specific GPS locations of your choosing. 3D models can be made in your favorite tool (we recommend Blender) and imported into Palimpsest. We currently use a special version of the Wavefront OBJ file format, but support for other formats may be added in the future.
- Unique GPS location system.
Palimpsest envisions places as more than just singular points on a map. Places are collections of objects that are related in some way. A place is its own origin that 3D models, sounds, and other objects can be added to. If a place moves, its collection moves with it. This allows for much more dynamic and interesting interaction with physical space than the conventional single-location-single-object model of other systems. Palimpsest also has many tools for calculating locations that account for latitude, longitude, altitude, and the curvature of the earth.
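The place-as-origin idea can be sketched in a few lines. This is an illustrative Python sketch, not Palimpsest's actual API: the class and field names are our assumptions, and world positions are simplified to flat x/y/z meters rather than real latitude and longitude.

```python
from dataclasses import dataclass, field

@dataclass
class Placed:
    """An object positioned relative to its place's origin (east, north, up in meters)."""
    name: str
    offset: tuple

@dataclass
class Place:
    """A place is its own origin; objects attached to it move when it moves."""
    origin: tuple                       # simplified world position (x, y, z) in meters
    objects: list = field(default_factory=list)

    def add(self, obj: Placed):
        self.objects.append(obj)

    def move_to(self, new_origin):
        self.origin = new_origin        # the whole collection follows implicitly

    def world_position(self, obj):
        ox, oy, oz = self.origin
        ex, ny, up = obj.offset
        return (ox + ex, oy + ny, oz + up)

gallery = Place(origin=(0.0, 0.0, 0.0))
statue = Placed("statue", (2.0, 3.0, 0.0))
gallery.add(statue)
print(gallery.world_position(statue))   # (2.0, 3.0, 0.0)
gallery.move_to((100.0, 50.0, 0.0))     # move the place...
print(gallery.world_position(statue))   # (102.0, 53.0, 0.0) ...and the statue follows
```

Because objects store only offsets from their place, moving a whole exhibition is one operation rather than one per object.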
- Event-based interaction.
Palimpsest adds an event-based system to its Lua under-structure. You can add proximity detectors that react to a user's movement into and out of an area, react to the user's tilt and heading (compass direction), detect touch events, or handle other user-initiated actions. There are high-level event structures, like proximities, or, if you need finer control, you can react directly to every GPS, tilt, or other movement event.
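A proximity detector boils down to tracking whether the user is inside a radius and firing a callback only when that state flips. The following Python sketch is illustrative (the names are assumptions, and positions are local meters rather than raw GPS coordinates):

```python
import math

class ProximityDetector:
    """Fires on_enter/on_exit callbacks when the user crosses a radius boundary."""
    def __init__(self, center, radius, on_enter=None, on_exit=None):
        self.center, self.radius = center, radius
        self.on_enter, self.on_exit = on_enter, on_exit
        self.inside = False

    def update(self, position):
        """Feed each new position fix; callbacks fire only on transitions."""
        now_inside = math.dist(self.center, position) <= self.radius
        if now_inside and not self.inside and self.on_enter:
            self.on_enter(position)
        elif not now_inside and self.inside and self.on_exit:
            self.on_exit(position)
        self.inside = now_inside

events = []
zone = ProximityDetector((0, 0), 10,
                         on_enter=lambda p: events.append("enter"),
                         on_exit=lambda p: events.append("exit"))
for pos in [(50, 0), (8, 0), (3, 0), (20, 0)]:   # simulated position updates
    zone.update(pos)
print(events)   # ['enter', 'exit']
```

Note the detector fires once per crossing, not on every update inside the zone, which is what makes it a high-level event rather than a raw GPS stream.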
- Timer based animation.
There are currently two types of timers, Interval Timers and Line Timers. An Interval Timer fires at discrete intervals, for as long as the timer is told to repeat. Line Timers fire successively over a period of time, until that time is reached. Both are effective for creating animations and events that need updating at periodic intervals.
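The two timer types above can be sketched against a simulated clock. This is a hedged illustration of the described behavior, not Palimpsest's implementation: an Interval Timer fires at discrete steps a fixed number of times, while a Line Timer reports continuous progress until its duration elapses.

```python
class IntervalTimer:
    """Fires a callback every `interval` seconds, `repeats` times."""
    def __init__(self, interval, repeats, callback):
        self.interval, self.repeats, self.callback = interval, repeats, callback
        self.fired = 0
        self.next_fire = interval

    def tick(self, now):
        while self.fired < self.repeats and now >= self.next_fire:
            self.fired += 1
            self.callback(self.fired)
            self.next_fire += self.interval

class LineTimer:
    """Fires successively with progress in [0, 1] until `duration` is reached."""
    def __init__(self, duration, callback):
        self.duration, self.callback = duration, callback
        self.done = False

    def tick(self, now):
        if self.done:
            return
        progress = min(now / self.duration, 1.0)
        self.callback(progress)
        if progress >= 1.0:
            self.done = True

interval_fires, line_progress = [], []
it = IntervalTimer(1.0, 3, interval_fires.append)
lt = LineTimer(2.0, line_progress.append)
for t in [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]:   # simulated clock, in seconds
    it.tick(t)
    lt.tick(t)
print(interval_fires)   # [1, 2, 3]
print(line_progress)    # [0.25, 0.5, 0.75, 1.0]
```

An Interval Timer suits discrete events (a sound every few seconds); a Line Timer suits smooth motion, since its progress value can drive interpolation directly.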
- Dynamic 3-D sound.
We use the OpenAL system for playback of sound. Sounds can be attached to locations, and will appear to be emanating from those locations. We support a multitude of sound compression formats, although some formats present hardware limitations on the number of sounds you can have.
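Sounds "emanating from a location" get quieter with distance. OpenAL's inverse-distance attenuation model (the clamped variant, per the OpenAL 1.1 specification) is simple to compute; here it is in Python, with the default parameter values as our assumption:

```python
def inverse_distance_gain(distance, reference=1.0, rolloff=1.0):
    """OpenAL-style inverse-distance attenuation (clamped):
    full gain at `reference` distance, falling off beyond it."""
    distance = max(distance, reference)   # clamp: no amplification up close
    return reference / (reference + rolloff * (distance - reference))

print(inverse_distance_gain(0.5))   # 1.0  (inside the reference distance)
print(inverse_distance_gain(2.0))   # 0.5
print(inverse_distance_gain(5.0))   # 0.2
```

Combined with stereo panning from the listener's heading, this is what makes a sound feel attached to a place as the user walks around it.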
- A dynamic and powerful scripting language for creating AR worlds
- The ability to create independent apps
Palimpsest is compiled as a Framework for Xcode. Just import the framework into your own project, add your own artwork and icons, throw your Palimpsest scripts in, and you are on your way! We will even make pre-made templates to get you started, but releasing your app is up to you.
- Debug mode.
Debug mode allows users to bring GPS located projects to their location if they are unable to physically visit a place. For instance, if you are making a project that is to be displayed in France, but you live in San Francisco, you may want to test your app without having to buy a plane ticket. Just turn debug mode on, and view your AR world from anywhere you want. Debug mode even works when there is not a reliable GPS signal.
- Transparent multithreading.
The system is fully and transparently multithreaded in the background, so that loading objects and other time-consuming tasks don't interrupt your ability to interact with and react to the user.
- In app web browsing
Palimpsest incorporates a fully functional web browser, and exposes an interface so that the web browser can interact directly with your Palimpsest script. You could make an app that loads a new web page whenever the user enters a place of interest, or you can bring up an HTML page that lets the user make a choice from a menu that then gets processed by your underlying Palimpsest script. The possibilities are endless.
- Networking interface.
Palimpsest 1.0 will incorporate a powerful, transparent networking interface. Want to store models and sounds on your server, and load them dynamically depending on where a user is? Just pass a URL to your model, and the system will fire an event when the model is finished loading. Or if you want to store user preferences so that they can be shared with other users of the app, just tell the system to upload your file or text, and the system will take care of the rest. Want finer control? Set up a callback that alerts you when new data is available during the download process, or requests data during the upload process.
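The callback pattern described above can be sketched as follows. This Python illustration is an assumption about the shape of such an interface, not the real one; the download itself is simulated in-process, where a real system would fetch over HTTP on a background thread, and the URL is a placeholder.

```python
class AssetLoader:
    """Registers per-URL callbacks and fires them as a (simulated) download
    progresses and completes."""
    def __init__(self):
        self.pending = {}

    def load(self, url, on_loaded, on_progress=None):
        """Start a load; on_loaded fires once the full asset has arrived."""
        self.pending[url] = (on_loaded, on_progress)

    def deliver(self, url, chunks):
        """Simulate the network delivering the asset in chunks."""
        on_loaded, on_progress = self.pending.pop(url)
        received = b""
        for chunk in chunks:
            received += chunk
            if on_progress:
                on_progress(len(received))   # finer control: byte-count updates
        on_loaded(url, received)             # high-level event: model is ready

loaded = []
loader = AssetLoader()
loader.load("http://example.com/statue.obj",
            on_loaded=lambda url, data: loaded.append((url, len(data))))
loader.deliver("http://example.com/statue.obj", [b"v 0 0 0\n", b"v 1 0 0\n"])
print(loaded)   # [('http://example.com/statue.obj', 16)]
```

The key point is that the script never blocks waiting on the network; it just describes what should happen when data arrives.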
Palimpsest will be open source and so will be any modifications made to it. As scientists, educators, business owners, artists, storytellers, and enthusiasts of all kinds build AR worlds with Palimpsest, the trunk grows and the possibilities will become infinite. We believe that Augmented Reality is the future, and we want to foster the community by creating tools for you to express your imagination. We want to see what you will make.
Your Imagination, Any Place.
If we go beyond the initial goal then we will release all this other stuff...
-We add JSON support. With JSON, you would define your objects and all of their important features in one simplified script, which then can be imported directly into Palimpsest, or be included in a Lua script. In web terms, think of this as the layout engine (or HTML equivalent) for Palimpsest. While Palimpsest is already easy and fast to learn and use, this would make it even faster to create dynamic AR worlds, and make it more accessible for those who don't want to have to do as much programming. This would also simplify the automatic creation of scripts by server-side web tools.
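To make the JSON idea concrete, here is what such a layout might look like. The schema below (field names like "place", "offset", "sound") is entirely hypothetical, our own illustration of a declarative scene description; only the use of standard JSON parsing is real.

```python
import json

# A hypothetical scene layout: a place plus the objects attached to it.
scene_json = """
{
  "place": {"name": "gallery", "lat": 40.7506, "lon": -73.9935},
  "objects": [
    {"model": "statue.obj", "offset": [2.0, 3.0, 0.0]},
    {"model": "frame.obj",  "offset": [0.0, 1.5, 2.0], "sound": "ambient.ogg"}
  ]
}
"""

scene = json.loads(scene_json)
print(scene["place"]["name"])          # gallery
print(len(scene["objects"]))           # 2
print(scene["objects"][1]["sound"])    # ambient.ogg
```

A layout like this separates *what* is in a world from the Lua logic that animates it, which is also why server-side tools could generate it automatically.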
-We add a frame based animation system. The basic Palimpsest system has support for animation by using timers. The new animation system would be as simple as providing a start point, end point, a time interval, and an object (or objects). Animations can run once, or repeat as many times as you like. And if you want finer control, you can always add callbacks that let you know where in the animation sequence you are. The animation system will allow for aggregating objects, so you can define an animation once, and use it multiple times with single objects or groups of objects. It will also have both an absolute mode, for objects you want to move based on a known coordinate system, and a relative mode, for when you want objects to move from their current position to another relative position. Finally, the animation engine will be tied to the frame rendering loop, so that it is scheduled to operate at the most opportune time for the most efficient operation.
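The absolute/relative distinction above reduces to where the end point comes from; the motion itself is interpolation. This Python sketch is our own minimal illustration of that design, not the planned implementation:

```python
def lerp(a, b, t):
    """Linear interpolation between points a and b at parameter t in [0, 1]."""
    return tuple(av + (bv - av) * t for av, bv in zip(a, b))

class Animation:
    """Moves an object from its position over `duration` seconds.
    Absolute mode targets a world coordinate; relative mode applies a delta
    to wherever the object currently is."""
    def __init__(self, start, target, duration, mode="absolute"):
        self.start = start
        if mode == "absolute":
            self.end = target
        else:  # relative: target is a delta from the start position
            self.end = tuple(s + d for s, d in zip(start, target))
        self.duration = duration

    def position_at(self, t):
        return lerp(self.start, self.end, min(t / self.duration, 1.0))

# Slide an object 10 m east of wherever it is, over two seconds.
anim = Animation((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), duration=2.0, mode="relative")
print(anim.position_at(1.0))   # (5.0, 0.0, 0.0)
print(anim.position_at(2.0))   # (10.0, 0.0, 0.0)
```

Driving `position_at` from the frame rendering loop, as the text describes, is what keeps motion smooth regardless of frame rate.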
-We hire a professional tech writer to write documentation for Palimpsest. In addition to the simple “getting started” guide and function reference, this level will allow us to document every major feature of the language in a way that is easy to understand. We already have a successful, published tech writer who has offered his services at a premium. He has used Palimpsest in its very pre-alpha state, and is intrigued by the possibilities it presents. The documentation will be released as a web-based manual (perhaps through a wiki), as a PDF, and as a free ebook.
-We create several tutorial scripts and projects, in conjunction with the manual, that detail how to use Palimpsest. We will cover functionality from the most basic how to get started, to the most complex interactive features. Use the tutorials to get a deeper understanding of the underlying principles, and then as a jumping off point for your own creations.
-We will add lighting and shader effects so that you can have more control over the look and feel of the objects within Palimpsest. There is currently no lighting system in Palimpsest, so all objects must be pre-rendered with any shadows or other lighting features. With this added feature, you can control lighting on the fly, and add complex surface transformations limited only by your imagination. Imagine being able to change the way a 3D object looks based on the time of day. Or making objects appear to be watery, or appear to have a “shiny” surface. This would make all of that possible.
-We provide a library of ready made 3D objects, sounds, images, and scripts for you to use as you like. These will be made freely available to the community to include in your own projects, or to work with when going through the tutorials provided at the lower tier. We will also provide a library of drop-in script components that have pre-made behaviors. Use these scripts with our library of objects, or use your own. Modify the script to your liking, or use it out of the box. Use your imagination.
-Palimpsest gets a user friendly developer application on the Mac. The application would have a graphical interface, with easy to use controls that automate the creation of AR worlds. Drag and drop 3D objects, move them into place, animate them, add sounds, user interactions, and proximity and other user event processors, all without touching a line of code. The system will be live, so changes you make will be visible immediately, without having to compile or re-run anything. And if you want more control, you can access the underlying script and modify it directly by hand. For the programmer, you get a tool that directly aids you in the creation and testing of your Augmented Reality app, without taking away any of the control over the details that you love. In addition, this product will allow complete non-programmers to get into the action, and start letting their imagination run wild.
-We make the above developer application available on Windows.
-We add Android support. This will widen the reach of Palimpsest to a much larger audience. Android support will include all the features above on devices which can support them. There are a wide range of Android devices, and available hardware and software features that each device can support. So we will make sure that Palimpsest runs on whatever the current developer device is from Google, as well as several of the most popular devices, as long as they support the hardware and software requirements of the app.
-We add surface recognition. Palimpsest was originally created as a GPS based system, which means it requires a GPS signal for locating objects in space. Other AR systems available on the market use image processing techniques to allow objects to be attached to surfaces, through recognition of tags or other visible features. This has several effects – it brings the ability to use AR into spaces without reliable (or any) GPS signal, and it tends to track more accurately than GPS. On the down side, it can often lose its tracking ability if the image it is following isn't fully visible, and doesn't work so well outside, where conditions can be much more variable than a controlled indoor environment. When used in conjunction with GPS, the downsides of GPS (no tracking indoors, can be jumpy outdoors), as well as the downsides of image tracking (not so good outdoors, can lose focus if the image is not clear), can be mitigated by using the strengths of both together. This stretch goal would allow us to put in support for image tracking techniques that can be used in conjunction with all the other features that Palimpsest already supports.
Without your help it might be another year before we complete Palimpsest 1.0 and release it. By that time you might have had to compromise by paying for something that does a lot less than Palimpsest can do. If we make our goal, in a year you will see Palimpsest 2.0, and that will be proof that Palimpsest will always offer you the best AR tools for free. Let's get there.