Small Update – SoundScape1.3

Just a quick update: I managed to put a build of my game onto my Android tablet to test the buttons, display size, and general working status on another device, with a lot of success! I had an issue where most of my script references were declared undefined and weren't recognised by the JavaScript parser used by the Android build tools. I fixed this with a solution I found online, which indicated I needed to insert var at the beginning of the declaration, going from

    script = GetComponent(Artefact2SoundOn);

to

    var script = GetComponent(Artefact2SoundOn);

To cut a long story short, the parser has apparently become more stringent since earlier builds, and the explicit var declares the variable so the stricter compiler can work out its type.
I also had to code the buttons for the player movement, which I got some info on from various places around the net, the best source being http://answers.unity3d.com/questions/143317/how-to-emulate-keyboard-keys-with-a-gui-button.html, an answer a member of the community gave about emulating keyboard keys using GUI buttons. The actual solution came down to telling the GUI buttons to emulate axis inputs: the code reads the Horizontal and Vertical axes, stores the return values in variables, and then, if the GUI buttons are pressed, drives the corresponding axis to move the player. I also had to change the order for the W key (walk forward), as the code was executing the axis before the animation, which meant the animation played every frame; so I now call the other object holding the walk animation for the collider first, and moved the getotherobject.play.animation.walk call to before the script checks for input/existence each frame, and voila, working controls!
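For reference, the general pattern looks something like the sketch below, though the button layout, sizes and speeds here are my own placeholders rather than the project's exact code:

    var moveSpeed : float = 3.0;
    var turnSpeed : float = 60.0;

    private var vertical : float = 0.0;
    private var horizontal : float = 0.0;

    function OnGUI () {
        // Reset each pass; RepeatButton returns true for as long as it is held down
        vertical = 0.0;
        horizontal = 0.0;

        if (GUI.RepeatButton (Rect (110, Screen.height - 120, 100, 100), "W"))
            vertical = 1.0;      // emulate pushing the Vertical axis forward
        if (GUI.RepeatButton (Rect (10, Screen.height - 120, 100, 100), "A"))
            horizontal = -1.0;
        if (GUI.RepeatButton (Rect (210, Screen.height - 120, 100, 100), "D"))
            horizontal = 1.0;
    }

    function Update () {
        // Combine the on-screen buttons with the real axes so the keyboard still works
        var v : float = Mathf.Clamp (vertical + Input.GetAxis ("Vertical"), -1.0, 1.0);
        var h : float = Mathf.Clamp (horizontal + Input.GetAxis ("Horizontal"), -1.0, 1.0);

        transform.Rotate (0, h * turnSpeed * Time.deltaTime, 0);
        transform.Translate (Vector3.forward * v * moveSpeed * Time.deltaTime);
    }

The nice thing about faking the axis values rather than faking key presses is that the same movement code runs whether the input comes from the touchscreen buttons or the keyboard.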
Blurry shot of the menu working on the dev build.
So, stuff to do today: finish off level 6, write the AI code (I MUST DO THIS!), and start an asset spreadsheet of the sounds needed per level. This must be completed by today or tomorrow at the latest, as I need to get onto the sound design very soon.
Stuff to do in the immediate future: change the UI layout to be more user friendly, add GUI skins, add voice descriptions of the menus, and enlarge the buttons in the menu.

Post-Christmas Development Work

Wow, it has been a while since I updated here. I have picked up development again on my Audiogameproject1.01, as it is referred to at the minute (I'm sure there will be many more versions). I have made several additions to the game in its current state.
Changes I have made to the previous revision of audioproject1.00a
I have added a holding screen so that the motion tracking can pick up the player and then wait for them to perform an action. The secondary main menu screen is not really a menu but a hub, where the player's gestures determine the selection they want to make. I have decided to design the UI this way for now as it seemed the simplest, most logical way to develop it; plus my knowledge of how to make GUI text within Unity is highly limited. Click functions are not available to the player in this game in its current guise, so I had to think of the most logical way to implement menu navigation, and a static screen with text and audio telling the user what to do seemed the best route. I had a difficult choice before concluding that I needed both the holding screen and the menu, in case a player moves about too much while the motion detection is in operation before the game initiates. This will solve a few problems for me and generally help to ease the player into the game, as opposed to just chucking them straight in. It will also make sure the tracking is working properly before the game initiates and specific actions are required. I need to improve the GUI if I have time, as it looks terrible from a design point of view, but it's functional.
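In practice the holding screen does not need to be anything fancy; a minimal sketch of the idea looks like this, where the instruction text, the GestureDetected call and the level name are all placeholders of mine rather than the project's actual names:

    var instructions : AudioClip;
    var nextLevel : String = "Level1";
    private var ready : boolean = false;

    function Start () {
        // Read the on-screen text aloud (assumes an AudioSource on this object)
        if (instructions != null)
            audio.PlayOneShot (instructions);
    }

    function OnGUI () {
        GUI.Label (Rect (Screen.width / 2 - 200, Screen.height / 2 - 25, 400, 50),
                   "Stand still until tracking is ready, then raise a hand to begin.");
    }

    // The motion-tracking component would call this once it has a stable lock
    // and has seen the player's selection gesture.
    function GestureDetected () {
        ready = true;
    }

    function Update () {
        if (ready)
            Application.LoadLevel (nextLevel);
    }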
I have extended the play area for the cave environment to incorporate 2D/3D designed sounds for things such as bats, and a small piece of music to give the inside of the cave a mystical feel. I have fixed bugs in the cave area associated with level-progression blockers and created a new script which waits for five seconds before destroying the entity it is attached to.
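The wait-and-destroy script itself is tiny; stripped down it amounts to something like this (delay is my own variable name):

    var delay : float = 5.0;

    function Start () {
        // Wait, then remove the object this script is attached to
        yield WaitForSeconds (delay);
        Destroy (gameObject);
    }

Unity's Destroy also takes an optional delay parameter (Destroy(gameObject, 5.0)), which does the same job in one line, but having the script makes it easier to hang extra behaviour, such as a sound, off the same timer.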
The new beach level play area is in the process of being built; walls, floors, and basic environmental sounds/mixing have been added. Events and gameplay need to be added/decided upon, and design changes may need to be made. Adaptations to the original level plan may include adding a metal detector for one of the artefacts, and story details such as hearing bombers overhead have been added to fit in with the theme better and improve the narrative.
The player component SensoryPlayerController has been retired and replaced in line with mazeplayercontroller – this controller allows the player to feel surfaces and hear audio feedback if the hand object collides with a wall. Mappings to the motion tracking will need to be added for this extended functionality. The change has been made to the cave and haunted house levels and added to the beach level, and this will become the new standard for the player character until/unless a new version is made.
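The audio feedback side of that boils down to a small script on the hand object, roughly like the sketch below; the "Wall" tag and the wallTouchSound name are assumptions of mine, and it expects an AudioSource on the same object:

    var wallTouchSound : AudioClip;

    function OnCollisionEnter (collision : Collision) {
        // Play a feedback sound whenever the hand touches tagged wall geometry
        if (collision.gameObject.CompareTag ("Wall"))
            audio.PlayOneShot (wallTouchSound);
    }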
Crouching has been decided upon as the way to exit the game, as it is not a motion a player will make accidentally, unlike swaying about. This may be subject to change if crouch functionality is required for gameplay. Players will be told about this using audio.
The woodenstairs sound has been used to test the wall sounds in each level I have moved over to the new mazeplayercontroller.
Schedule
I was ahead of my schedule before the Christmas break. My target is to get the levels built week by week; already 1/6 levels are playable and 1/6 link to the Rift, and 3/6 levels have an available play space to walk in. I aim to have at least 2-3 levels playable by the end of next week.
Functions I need to add/explore
Metal Detection – Pinging a sound back from a player action such as holding left arm forward
Understanding the hold-and-destroy script in conjunction with playing a sound – can I play a fade-out sound for artefacts before they get destroyed? The current sounds are very choppy. (See the sketch after this list.)
AI – this must be looked at even if not used. AI may be used for the monster in the haunted house and the Sentinel in the underground level, and may also be incorporated into the roboguards in the futuristic level.
Holding screens before missions – some missions need an explanation before they start. Speech inside the missions needs shortening or moving to these screens, as prolonged explanation causes audio to play over other audio and gets really boring for players.
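On the fade-out question above, my current thinking is something along these lines (just a sketch, assuming the artefact has an AudioSource attached, and with pickupSound, fadeTime and FadeAndDestroy being my own placeholder names):

    var pickupSound : AudioClip;
    var fadeTime : float = 1.0;

    // Fade the artefact's looping audio out over fadeTime seconds, play a
    // one-shot pickup sound, then remove the artefact from the scene.
    function FadeAndDestroy () {
        for (var t : float = 0.0; t < fadeTime; t += Time.deltaTime) {
            audio.volume = 1.0 - (t / fadeTime);
            yield;   // wait one frame
        }
        audio.Stop ();
        if (pickupSound != null)
            AudioSource.PlayClipAtPoint (pickupSound, transform.position);
        Destroy (gameObject);
    }

The existing wait-and-destroy script could call something like FadeAndDestroy instead of destroying the object outright, which should get rid of the choppy cut-off.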
That’s my update summary for this week. I will start posting one of these every week, or every few days as needed; these are for the purpose of documentation for my final project text.

Beginnings of the final project

I have started work on my audio game by building a prototype in Unity. So far I have learnt how to map controls and make a first-person control system where left and right rotate the player and the forward footstep controls can be mapped to two buttons.
I have learnt from tutorials how to script and use triggers for footsteps and to activate audio cues.
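Those trigger scripts are all variations on the same pattern; a stripped-down version looks roughly like the sketch below, where the one-shot behaviour, the Player tag and the cue name are my own assumptions:

    // Sits on an invisible trigger volume (a Collider with Is Trigger ticked)
    // and expects an AudioSource on the same object.
    var cue : AudioClip;
    var playOnce : boolean = true;

    function OnTriggerEnter (other : Collider) {
        if (other.CompareTag ("Player")) {
            audio.PlayOneShot (cue);
            if (playOnce)
                collider.enabled = false;   // stop the cue firing again
        }
    }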
I need to learn how to script simple AI that the player can interact with, so that health can be taken off the enemy or the player and the enemy can be destroyed.
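The health and damage side of that could start out as something as small as the sketch below; maxHealth and ApplyDamage are names I am making up here rather than anything already in the project:

    var maxHealth : float = 100.0;
    private var health : float;

    function Start () {
        health = maxHealth;
    }

    // The player's attack script would call this to deal damage,
    // e.g. enemyObject.SendMessage("ApplyDamage", 10.0);
    function ApplyDamage (amount : float) {
        health -= amount;
        if (health <= 0.0)
            Destroy (gameObject);   // enemy is removed when health runs out
    }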
Currently the project is on schedule. The audio recordings will need to be planned in the coming week, as well as the level design.