Context

The assignment asks us to demonstrate an understanding of UI/UX, visual hierarchy, typography, and information design in an interactive setting by designing the control interface for an autonomous vehicle.

Research & Learning Process

Firstly, I tried to find out what an autonomous vehicle is and how it works.

[Image: conventional control wheel and autonomous car]
[Image: vacuum robot]

After doing some research, I concluded that an autonomous vehicle should be capable of sensing its environment and navigating without human input. A product that comes to mind is the ‘vacuum robots’ on the market nowadays. They fit all the criteria of a basic autonomous device: they do the work and charge themselves with very little human input.

Then I had the idea of a robot that works on the same principle but scans instead of cleans, to explore a remote area or another planet. Since the design of the vehicle is not part of the assignment, I decided to use the character Eve from the movie ‘Wall-E’ as my autonomous scanning device.

I then started to think about the functions of the device and the elements of the interface. I tried to find out what parameters should be included and what kinds of buttons should be used.

I also looked at futuristic interfaces in sci-fi movies, but in the end I found that they contain a lot of meaningless lines and dots to create a ‘high-tech’ feel. Such complex interfaces are not suitable for an autonomous device and only make it harder to use.

Then I looked at control systems in the real world.

The photo on the left shows the control interface of the London Underground system. The button panel has a very simple design and almost looks like the gaming machine in the photo next to it.

When I looked at the photo of NASA’s control centre, I realised that the screens take up a lot of space and that different people are responsible for different tasks in a complex project.

[Image: NASA control centre]

Therefore, I started thinking about using a virtual reality system to replace the room filled with monitors and people. The VR system itself requires very little space compared to the current control centre. There is a ‘stage’ that senses whether the person stepping onto it is authorised; the sensor also tracks the technician’s movement. The technician simply needs to step onto the stage and put on the VR helmet to start controlling the autonomous device.
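
To make the idea concrete, here is a rough TypeScript sketch of the gatekeeping logic I had in mind; every name in it is hypothetical, since I never built the sensor side:

```typescript
// Conceptual sketch of the VR stage flow; all names are hypothetical.
interface Technician {
  id: string;
  authorised: boolean;
}

// Control is only granted once an authorised technician is on the stage
// and has put on the VR helmet; their movement is then sent to the device.
function grantControl(person: Technician, helmetOn: boolean): boolean {
  if (!person.authorised) {
    console.log('Access denied: person on stage is not authorised.');
    return false;
  }
  if (!helmetOn) {
    console.log('Waiting for the technician to put on the VR helmet.');
    return false;
  }
  console.log(`Control session started for ${person.id}.`);
  return true;
}

grantControl({ id: 'tech-01', authorised: true }, true);
```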

[Image: ideas]

I then got stuck on the VR interface, so I looked at the functions of the control system instead. I realised there need to be different levels of control, and that it is not necessary for general users to have all the options.

I also thought about different scenarios and icons for the interface and came up with a colour-coded control scheme, in which available options are shown in green, activated options in red, and a button turns yellow while its task is processing.
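
The colour logic itself is simple enough to express in a few lines; here is an illustrative TypeScript sketch (the state names are my own, not taken from any real prototype):

```typescript
// Illustrative mapping of button states to colours (names are hypothetical).
type ButtonState = 'available' | 'processing' | 'activated';

// green = available, yellow = task in progress, red = activated
const stateColour: Record<ButtonState, string> = {
  available: 'green',
  processing: 'yellow',
  activated: 'red',
};

// Example: a scan button turns yellow while the scan runs, then red.
console.log(stateColour['processing']); // "yellow"
```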

After doing research on implementing a VR environment and control system, I realised I did not have the knowledge and felt that I would not be able to finish such a task before the deadline. Therefore, I changed my plan to making a simple mobile app for general users to monitor the findings of the device. I decided to apply the same style as the movie and to minimise the use of text in the navigation menu.


[Image: eve_plant]

[Image: icon style]

Applying user flow knowledge

We had a workshop on designing user flows during the term, in which we were asked to pay attention to everyday tasks and record every step of them. This practice helped me to think through all the possibilities. When designing the interface, I thought about what message should be shown when users do unexpected things.
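
As a simple sketch of that idea (the action names are invented for illustration), unexpected input can be answered with a clear message rather than silence:

```typescript
// Sketch: answer unexpected actions with a clear message (names invented).
const knownActions = new Set(['scan', 'report', 'home']);

function handleAction(action: string): string {
  if (!knownActions.has(action)) {
    // Unexpected input: tell the user what happened instead of failing silently.
    return `"${action}" is not available here; returning to the summary view.`;
  }
  return `Running "${action}".`;
}

console.log(handleAction('shake-device'));
```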

When filling the app with data, I looked at figures online in order to make the interface look real. Both of the charts are reproductions of actual charts with figures for Jupiter.

[Images: atmosphere chart and wind speeds chart]

The Outcome

Mobile App Prototype (version 1)

[Slideshow: version 1 prototype screens]

Indicators that light up as the corresponding elements are found on a planet

[Image: indicators]
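
The indicator behaviour can be sketched like this (the element list and function names are illustrative only):

```typescript
// Sketch: light up an indicator for each element the device detects.
const trackedElements = ['hydrogen', 'helium', 'oxygen', 'carbon'] as const;
type TrackedElement = (typeof trackedElements)[number];

// false = dark, true = lit (element has been found on the planet)
const indicators = new Map<TrackedElement, boolean>(
  trackedElements.map((e) => [e, false] as [TrackedElement, boolean]),
);

function reportFinding(element: TrackedElement): void {
  indicators.set(element, true); // light up the matching indicator
}

reportFinding('hydrogen');
console.log(indicators.get('hydrogen')); // true
```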

Click here to preview and interact with it

Mobile App Prototype (version 2)

During the critique session, some classmates pointed out that the navigation buttons were too big. I also found that the ‘back’ button in the top left corner could cause confusion. In the second version, I used smaller navigation buttons and a ‘home’ icon that links back to the summary view.

[Image: report]

Click here to interact with the second version

Demonstration

Available at https://youtu.be/yR4iaayd1uM

Evaluation

In order to finish this project, I learned to use the iOS app POP to create quick prototypes and Adobe XD for a refined prototype. I also tried different online tools such as Marvel and InVision. During the design phase, I realised the importance of having enough information and knowing the needs of end users. I think I spent too much time on the concept and the functions without realising the limits of my technical knowledge, which left too little time to implement the interface. I found that building a 3D environment in Unity would be a big challenge for me, and I hope I will have the chance to learn the software in the future.

After finishing the second version of the mobile app, I asked my friends for feedback, and they preferred the newer one. The reason I kept the big buttons in the earlier version might be that I had spent a lot of time deciding on and drawing the right icons. From this experience, I understand that it is crucial to have someone who is not part of the design team look at the outcome and give suggestions.