DSP Astronautics Project

Create a simulation for the Astronautics Department of Kingston University based on the "Mission to Neptune Report" produced by a student group in 2013.

Monday 19 January 2015

Final Reflections

Since we were submitting the project today, I thought I should put down some of my thoughts regarding it.

Summary, briefly

We started out the project to develop a simulation for the Astronautics program of Kingston University, but over time, between discussions with our mentor and group discussions, the objective of this project changed to a recruitment campaign for the Astronautics program. We attempted to achieve this by using new, attractive technologies in the context of the Astronautics program.

My role in this project was to assist our team's technical lead and help out wherever needed. A good part of my time was spent researching new technologies and their implementation. The team worked well together, and we were able to achieve our intended objectives fairly well. I played a key role in developing the augmented reality application and also helped develop the basic code for the compositional data collection minigame. I also wrote the scope section of our final submission.

Reflecting on the project

Looking back, I think as a team we were a little indecisive. The structure of our project kept changing almost to the end. I was very happy to work with my team, they were a great bunch, and as the project changed we all came together to make it work. However, I do feel that a little more time refining the ideas would have led to an even better end product, as we all have that potential within us.
The augmented reality application in particular was thought up very much towards the end and did not have time to mature as a product. We also could not test our minigames extensively, due to limitations in time and in access to the Oculus Rift.
From a developer's point of view, during the early stages of the project we as a group were looking into and researching all sorts of things. It seemed very haphazard and disorganized, but I feel that through all those ideas we came to refine our objective and collectively decide on our final product. However, I must point out that it shortened the time for developing the product; most of the development work was done towards the end of the project.
This was a big learning experience for me, and I am sure for the others as well. We were not restricted by our roles and helped out in researching and sharing ideas as a group.

In conclusion, this was a fun learning experience that has produced a product with a lot of potential for the Astronautics program of Kingston.

Some personal remarks

For me personally, I am glad to have been on this team. Most of this project involved things I had not worked with before. This was the first time I had worked with Unity and the Oculus Rift, and a good part of my time was spent learning the tools available to me and going through tutorials. I had fun researching and developing the augmented reality technology; this was also a first for me. I think one of the more important lessons I have learned is the importance of project management in working efficiently towards a common goal.

Sunday 18 January 2015

Minigame 4 - Compositional data collection game



Since I am assisting Rob, I worked on this minigame. As this is going to be integrated for use with the Oculus, I created the game objects and game logic for just this game. When Rob integrates it into the minigame set, the camera controls (for the Rift) and the skybox will be added.

Introduction

One of the missions in the 'Mission to Neptune Report' is to collect compositional data of Neptune. To simulate this, the player has to look at different parts of Neptune and press a button. The mission is over once the player has collected 100% of the data or the time has run out.

Concept

Neptune will contain four empty collider objects inside its body. Using the raycast function, a ray is drawn from the center of the camera forward at the press of a key. Every time the ray collides with one of the colliders, that section's data counts as collected. Once all four collider objects' data has been collected, the game is over.
The main feature of the code in this minigame is the raycast from the center of the screen; here is that section of the code.


// Cast a ray from the center of the viewport, forward from the camera;
// hitInfo.collider tells us which of Neptune's zones was hit.
Ray rayOrigin = Camera.main.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0));
RaycastHit hitInfo;
if (Physics.Raycast(rayOrigin, out hitInfo, distance))

 
I also used GUI.Label to view the outputs here.
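
Putting the pieces together, the collection logic looks roughly like the sketch below. This is a minimal sketch rather than the exact project code: the class name, collider tag, collect key, and timer values are all illustrative assumptions.

using System.Collections.Generic;
using UnityEngine;

// Collects "compositional data" by raycasting from the screen center at the
// hidden collider zones inside Neptune; the mission ends when all four zones
// have been hit or the timer runs out.
public class DataCollector : MonoBehaviour
{
    public float distance = 1000f;   // ray length (illustrative value)
    public float timeLimit = 120f;   // seconds (illustrative value)

    private HashSet<Collider> collected = new HashSet<Collider>();
    private float elapsed;

    void Update()
    {
        if (collected.Count == 4 || elapsed > timeLimit)
            return; // mission already over
        elapsed += Time.deltaTime;

        if (Input.GetKeyDown(KeyCode.Space)) // illustrative collect key
        {
            Ray rayOrigin = Camera.main.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0));
            RaycastHit hitInfo;
            if (Physics.Raycast(rayOrigin, out hitInfo, distance) &&
                hitInfo.collider.CompareTag("DataZone")) // illustrative tag
            {
                collected.Add(hitInfo.collider); // each zone only counts once
            }
        }
    }

    void OnGUI()
    {
        // GUI.Label shows the player's progress, as mentioned above.
        GUI.Label(new Rect(10, 10, 300, 20), "Data collected: " + (collected.Count * 25) + "%");
        if (collected.Count == 4)
            GUI.Label(new Rect(10, 30, 300, 20), "Mission complete");
        else if (elapsed > timeLimit)
            GUI.Label(new Rect(10, 30, 300, 20), "Time has run out");
    }
}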

Conclusion

Though this simulation oversimplifies the science behind collecting compositional data, it contains the fundamental idea of collecting compositional data as described in the report. This game could be made better by using more collider volumes inside the Neptune body. To make the game more scientifically accurate, we would need to segment sections of Neptune's surface, detect ray collisions with them, and collect compositional data based on those surface locations. To be even more accurate, the game should present something similar to actual data, but since we won't know that until we actually carry out the mission, the game will probably just say something along the lines of 'mission complete'.

Final Android app

The Android app is finally done. I got Neptune from Rob and used the new Triton. I also added a KU logo. I have put it up on Dropbox under the augmented reality folder; it's called KUNeptune.apk. You can also download it here. The trigger image is still the same, our logo. This brings up the question: should we include that image in the report? Below is a screenshot of what the final version looks like. I have also put the AR section of the report in Dropbox in the same folder; you can also download it here.



Monday 5 January 2015

Bug in demo1

There was a bug in demo1 of the Android app. The reason was that the 3D models must be made children of the target image in the Unity scene view; only then will the models render when the target image is tracked. Otherwise, the models will always render. Though it was a simple mistake, I ended up redoing the whole thing before I figured it out.
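For reference, the scene hierarchy should end up looking something like this (the model names are just examples):

ARCamera
ImageTarget
    Neptune (3D model, child of ImageTarget)
    Triton (3D model, child of ImageTarget)

Parented this way, the models are only rendered while Vuforia is tracking the target image.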
Here is the link to download the new Android app, demo2.

AR Demo1

Using the 3D models the group created, I made a small app that can be used as a demo, with a little bit of code to simulate planetary orbit. I have put the app up on Dropbox as well as Google Drive (link here). It uses the same trigger image, and the models are the moons that the group created. The lighting is much better here.
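
The orbit code just rotates each moon around the planet a little every frame. A minimal sketch of the idea (the class name, field names, and speed value are my illustrative choices, not the exact demo code):

using UnityEngine;

// Fakes a circular orbit by rotating this object (a moon) around a central body.
public class Orbit : MonoBehaviour
{
    public Transform centralBody;        // e.g. Neptune (assigned in the Inspector)
    public float degreesPerSecond = 20f; // orbital speed (illustrative value)

    void Update()
    {
        // Rotate around the central body's position about the world up axis.
        transform.RotateAround(centralBody.position, Vector3.up,
                               degreesPerSecond * Time.deltaTime);
    }
}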

Augmented Reality App

Since the team has decided to go with Unity to create 3D images, I researched and built an app that produces augmented reality in 3D on an Android phone.

  1. The first step was to create the code required for the trigger image to initiate the augmented reality features. I used 'Vuforia' for this, the primary reason being that it's free. The Vuforia developer website will generate the code for the trigger image in Unity. I created an account and uploaded a trigger image; in this case it was simply one of the logo ideas we were considering for "Oceanus Simulations".
  2. In order for Unity to recognize Vuforia, an extension for Vuforia compatible with Unity was downloaded and imported into Unity, along with the code generated for the target image.
  3. I added the 'Image Target' asset and 'AR Camera' into the scene and set the 'trackable image handler' setting to the preferred image target script (see the sketch after this list).
  4. I added a few 3D models for testing, plus lighting and some simple animations. I should mention here that example 1 needs more lighting, as it looks a little dark. I was only able to notice this after I created the app, so I will have to keep it in mind next time. Below is an image from when I was playing around with Unity; I used some particle effects and a rotation animation for the sphere.
  5. The final step was to build it for Android. I had some trouble at this stage, as Unity requires the Android SDK libraries; however, for some reason my laptop did not want to install the Android SDK. After many trials and much reading on forums, I was able to solve the problem and build my example project. One of the major causes of my problems was administrative access. I decided to work on my laptop as it was easier with the webcam, since I can test the apps before building them.
  6. Here (can be downloaded here) is the built Android app. It needs to be installed on the phone; when it is running, point the camera at the target image (can be downloaded here). The downloads are on Google Drive. I have also uploaded them on Dropbox for our team.
  7. The next step is to create the AR designed by the team.
  8. Also, prepare to build apps for other platforms.
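
For the handler in step 3, the idea is a script on the Image Target that shows the child models only while the target is tracked. A rough sketch is below; it follows the pattern of Vuforia's default trackable event handler, but the exact class and method names can differ between Vuforia versions, so treat this as an assumption rather than the exact code.

using UnityEngine;
using Vuforia;

// Shows the child models only while the image target is detected or tracked.
public class ModelToggleHandler : MonoBehaviour, ITrackableEventHandler
{
    void Start()
    {
        TrackableBehaviour trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool tracked = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED;

        // Enable or disable every renderer under the image target.
        foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
            r.enabled = tracked;
    }
}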

Sunday 4 January 2015

New Deliverable

We decided as a group to have three separate pieces to our deliverable. As the primary objective of our project is to attract students into the Astronautics program, some flair in technology will capture their attention:
  • Oculus Rift game
  • presentation in the background
  • augmented reality app
Augmented reality
We wanted something that would capture the audience and make them think, "this is cool."
So first I looked into commercial augmented reality apps. Currently there are many games and travel apps that use augmented reality, such as ARDefender and Travel Guide. I was particularly interested in Aurasma.

Aurasma
This is a free app that can be downloaded from the app store, so it has no compatibility issues whether it is Android, iPhone, or Windows. It's free to register an account and upload a trigger image. The trigger image is the picture the app will recognize to initiate the augmented reality features. For a viewer to use this, they simply download the wrapper from a specified account to view the augmented reality for a particular image. The desired auras (augmented reality elements) are added as media on the trigger image in the developer page in Aurasma. Aurasma demo link here. The downside is that the media we can use as auras is limited to video and images, as you may have noticed in the demo video.

Unity AR
We could use Unity to create the augmented reality; a quick sample image is here. I used the Oceanus Simulations logo to show 3D models. I used my phone to display the trigger image, as I did not have access to a printer at the time.

The advantage of using Unity is that we can use 3D models, unlike in Aurasma. However, the downside is that, if we were to use this, we would either have to supply the hardware with the app installed or have versions compatible with various platforms ready for users to download. The link to this project is here in Dropbox.

In comparing our options, the team decided to go with Unity, as using 3D is more attractive.