We are driven to explore the intersection of travel and technology, and the automobile is approaching the apex of a revolution. In a world where “Google” has a driver’s license, transportation design and the significance of the automobile are shifting. The car is still a prime symbol of independence, but it is also a sanctuary: one of the few remaining places where we can have private time. While a computer does not take our hands off the wheel just yet, the more technology takes over, the more time we’ll have to catch up with friends, to work, or to be entertained. The focus will no longer be just on controlling the vehicle; the new question is how we interact with what we love at 60 mph. To design for this new connected + autonomous + shared car of the future, we need a platform that is flexible enough to adapt to these trends while providing a consistent way to measure the impact of new concepts.
To meet these needs, it became clear that a simple 3D driving simulation, built to generate the cognitive load of driving, was not enough; after all, the car may be doing the driving. To explore the connections of the car of the future, we needed something that allows simulated driving alongside emerging infrastructure and sensors: simulated data from city infrastructure, cloud-based traffic services, hosted natural language processing, in-car computer vision systems, and even simulated vehicle-to-vehicle communication. The passenger does not have to understand these systems; for designers, however, these connections are the key to future driving experiences. Simulation is a crucial part of understanding these connections, because it gives us the ability to rapidly prototype new experiences. So we went looking for a platform to perform these simulations… Funny thing: we couldn’t find anything that met all those requirements, so we built our own. We call it JoyRide.
JoyRide is made up of five components: a virtual world complete with a virtual car, an event authoring tool, a physical mockup of a vehicle, an open human machine interface (HMI), and a messaging-based software architecture.
The virtual world creates the driving environment, from roads and traffic to weather.
Though built in Unity, JoyRide is not a driving game; it is a true simulation. It provides statistics from the virtual car in the form of simulated real-time OBD-II and GPS data feeds, along with virtual city infrastructure like traffic signals. It brings the core driving experience to the platform with the following features:
- A driving physics engine, including integration with a force-feedback steering wheel
- Real car statistics: speed, energy consumption, RPM, location, direction of travel, etc.
- Signals that mimic “smart” lights, e.g. only activating a left-turn signal when a car is in the left-turn lane
- Traffic that responds to current conditions: taking free right turns, rushing the left turn as the light turns red, etc.
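To make the real-time data feed concrete, here is a minimal Python sketch of what one simulated OBD-II + GPS telemetry frame could look like. The field names and JSON shape are illustrative assumptions, not JoyRide’s actual schema:

```python
import json
import time

def telemetry_frame(speed_kph, rpm, lat, lon, heading_deg):
    """Package simulated car statistics into a JSON frame,
    mimicking a real-time OBD-II + GPS data feed.
    (Hypothetical field names, for illustration only.)"""
    return json.dumps({
        "timestamp": time.time(),
        "speed_kph": speed_kph,
        "rpm": rpm,
        "gps": {"lat": lat, "lon": lon},
        "heading_deg": heading_deg,
    })

frame = telemetry_frame(58.0, 2100, 45.52, -122.68, 270.0)
print(frame)
```

Emitting frames like this on a fixed tick is enough for downstream components to treat the virtual car exactly like a real one.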
Messaging over MQTT allows all the parts of this simulation to be loosely coupled, providing the flexibility to incorporate new technology and the ability to connect to real-world data: weather, traffic, communication, etc. This architecture provides a consistent way to send data through the platform, creating the following benefits:
- A simple standard for integrating new technology
- Support for connections to external cloud-based data services
- Measurable logging of all user and system interactions
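Loose coupling over MQTT mostly comes down to a sensible topic layout: each statistic gets its own topic, so a new component can subscribe without touching anything else. A small Python sketch of that fan-out (the `joyride/<car>/<stat>` topic scheme is a hypothetical example, not the platform’s real namespace):

```python
import json

def to_mqtt_messages(car_id, stats):
    """Fan a dict of car statistics out into (topic, payload) pairs.
    Topic layout is an illustrative assumption: joyride/<car>/<stat>."""
    return [
        (f"joyride/{car_id}/{name}", json.dumps({"value": value}))
        for name, value in stats.items()
    ]

msgs = to_mqtt_messages("car1", {"speed_kph": 58.0, "rpm": 2100})
for topic, payload in msgs:
    print(topic, payload)
```

Actually publishing each pair is then a one-liner with any MQTT client library, e.g. `client.publish(topic, payload)` in Eclipse Paho.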
A big part of the future driving experience comes from how events inside and outside the car come together over the course of a journey. The authoring tool is built on Node-RED and lets us orchestrate events to quickly prototype interactions by defining their conditions. For example: “when the driver turns down Main Street and gets within 3 blocks of the cleaners, send a message indicating their dry cleaning is ready to pick up”.
With this tool, our designers can define not only when that message is sent, but how it is shared with the driver and what actions they can take with that information. Using the authoring tool, our designers are able to create very elaborate scenarios with simple “if this, then that” logic without the need for custom development.
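Under the hood, that kind of rule reduces to a condition/action pair evaluated against incoming events. A toy Python sketch of the dry-cleaning example (the event fields and rule structure are hypothetical, not the Node-RED flow format):

```python
def make_rule(condition, action):
    """Build a rule that fires `action(event)` whenever
    `condition(event)` holds, and returns None otherwise."""
    def rule(event):
        if condition(event):
            return action(event)
        return None
    return rule

# "When the driver is on Main Street within 3 blocks of the cleaners,
#  send a pickup reminder."  (Illustrative event fields.)
dry_cleaning = make_rule(
    condition=lambda e: e["street"] == "Main St" and e["blocks_to_cleaners"] <= 3,
    action=lambda e: "Your dry cleaning is ready for pickup",
)

print(dry_cleaning({"street": "Main St", "blocks_to_cleaners": 2}))  # fires
print(dry_cleaning({"street": "Oak St", "blocks_to_cleaners": 2}))   # does not fire
```

In Node-RED the same logic is wired visually, which is what lets designers build these scenarios without writing code.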
Regardless of the layout of the interior of the car, we are guaranteed two things: there will be a windshield through which people see the road ahead, and there will be places for people to sit. The mockup provides the windscreen, a generic dashboard, and two forward-facing seats. Placement of control surfaces and information displays is left open in JoyRide, so we can try out new configurations. These simple physical cues provide a contextual environment for placing traditional automotive technology and let concepts take on the feel of being in a real vehicle.
The open HMI is everything the user experiences, from the buttons on the steering wheel to the way information is presented across all the displays.
The traditional layout of an analog instrument cluster and head unit is quickly fading away. The open HMI framework lets us try out new graphical displays, physical knob and button configurations, and even gesture-based controls. It provides our designers with a wide palette of features to build from, including:
- Support for screens of any size or resolution
- Addressable displays to coordinate what is shown where in the scenario authoring tool
- Support for rapid prototyping using Adobe Edge
- Dynamically reconfigurable physical controls (knobs, switches, etc.)
- Support to display controls on mobile phones and tablets
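Addressable displays boil down to routing content by display name, so the scenario authoring tool can decide what is shown where. A minimal Python sketch of that idea (the display names and message shape are illustrative assumptions):

```python
class DisplayRouter:
    """Route HMI content to named displays; the authoring tool
    addresses a display by name rather than by physical screen."""

    def __init__(self):
        self.displays = {}

    def register(self, name, handler):
        """Attach a handler (e.g. a screen renderer) under a name."""
        self.displays[name] = handler

    def show(self, name, content):
        """Send content to the named display."""
        handler = self.displays.get(name)
        if handler is None:
            raise KeyError(f"no display registered as {name!r}")
        handler(content)

# Hypothetical usage: a cluster display that records what it shows.
router = DisplayRouter()
shown = []
router.register("cluster", shown.append)
router.show("cluster", "Dry cleaning ready in 3 blocks")
print(shown)
```

Because displays are addressed by name, swapping a tablet in for a dashboard screen only changes which handler is registered, not the scenario itself.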
With this platform we can truly start exploring the future of the connected car. We have several experiments built on JoyRide that we are looking forward to sharing with you soon. Stay tuned.