I recently presented a paper at the Asia-Pacific Rail Conference in Hong Kong, including consideration of the implications of facial recognition on revenue collection, and therefore the design of stations.
This led to an invitation to attend an ideation workshop for Artificial Intelligence (AI) in Mass Public Transport in the Next 5 Years, organised by the International Association of Public Transport (UITP), and gathering together transport operators, software solutions providers and educational institutions.
UITP presented the interim results of a survey of appropriate users and stakeholders, and the top three responses in terms of the current applications of AI in public transport were:
This article shares my thoughts in response to the above, and is focussed on AI in public transport.
Whilst there are many ways to define artificial intelligence, in its simplest form it is the ability of computers to learn and self-improve.
IBM diagram illustrating the cognitive process of their AI system ‘Watson’
Artificial intelligence can be evident in:
The development of artificial intelligence comes at a timely point in our technological evolution, as we become overloaded by data and, more importantly, by the challenge of managing, accessing, and using that data appropriately.
Moore’s law describes and predicts the growth of computer processing power over time. We are now at a stage in technological development where the collection and storage of data exceeds our ability to process it. This is known as the ‘data deluge’ gap.
Graphic representation of ‘data gap’ (based on source: High Performance and Embedded Architecture and Compilation, HiPEAC Vision 2015 Report)
It is also important to recognise that the output from a database may not necessarily be known or even recognised at the time of collection of the original data set. Data is being stored in many different formats, organisational protocols, and languages. It is clear that access and management of such data presents a huge challenge and one which can only be solved through the application of artificial intelligence.
Transport operators already collect and store huge amounts of data from every aspect of the industry, spanning travel booking to wear and tear. For example, an average metro train has over 2000 sensors providing constant and real-time information to the operator. It is the organisation and access to this data which is currently inefficient and impractical, but already increasingly assisted by intelligent search and processing algorithms.
Most of the public realm is covered by digital video cameras, in either public or private ownership: security cameras, secondary monitoring systems, and even personal devices such as dash-cams and smartphones. The storage of such data typically grows with demand, but efficient access to this raw information, and the recognition of useful data within it, remains impractical and restricted.
A recent breakthrough in research at Stanford University has allowed AI systems to learn to visually recognise elements within a scene, label them, and describe what they are seeing. To ‘see’, not just to ‘look’.
This may sound fundamentally simple, but the ability of a three-year-old child to recognise a cat or a dog, distinguish between them, and do so irrespective of action, lighting, shape or partial concealment is something that computing power alone has not, to date, managed to achieve.
The researchers recognised that the answer did not lie in improving the algorithms by which computers recognise elements, but in giving computers the ability to learn through accumulated experience and teaching (much like a child), together with access to huge databases of information, such as Google’s open library of images.
Through AI, our computer systems will be able to search and recognise specific collections, patterns, and trends in datasets required to inform decision-making.
An interesting (and seemingly obvious) development is the ability to assess the emotion of people in an environment by counting the smiles! This is known as sentiment analysis, and is already being employed within a limited context.
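The smile-counting idea above can be sketched very simply. The per-frame counts here are assumed inputs; a real system would obtain them from a face and expression detector:

```python
# Sketch: aggregate a crowd "sentiment" score from per-frame smile detections.
# The (smiles, faces) counts are hypothetical inputs, not real detector output.

def sentiment_score(smiles: int, faces: int) -> float:
    """Fraction of detected faces that are smiling; 0.0 when no faces are seen."""
    return smiles / faces if faces else 0.0

def rolling_sentiment(frames: list[tuple[int, int]]) -> float:
    """Average the per-frame scores over a window of (smiles, faces) observations."""
    scores = [sentiment_score(s, f) for s, f in frames]
    return sum(scores) / len(scores) if scores else 0.0

# Hypothetical CCTV window: (smiles detected, faces detected) per frame.
window = [(3, 10), (5, 10), (2, 8)]
print(round(rolling_sentiment(window), 3))  # 0.35
```

Tracking this score over time, rather than reading any single frame, is what makes the measure robust to detector noise.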
This may sound simplistic, but we now have the ability for our computers to look at a scene, understand what they are looking at in even the smallest detail, process that information, and provide us with usable output data. This can be done far more quickly than by human operators, and applied to a database of information (such as Google’s Street View), traffic monitoring systems (such as Atkins’ North Avenue Smart Corridor in Atlanta), and live feeds such as CCTV or vehicle-mounted cameras.
On a more pragmatic level, public transport seeks to be dynamic in its operation, striving for an on-demand service driven by inputs such as traffic flow analysis, ticketing and demand prediction. For example, this applies to bus services, allowing for increased bus availability where required (and fewer empty buses), and to flexible metro headways, ensuring optimal waiting times and carriage space availability.
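The headway side of this can be illustrated with a simple calculation. The demand figure, train capacity and operating limits below are assumptions for the sake of the sketch; a real scheduler would weigh many more constraints:

```python
# Sketch: choose a metro headway so that predicted demand fits the trains.
# All figures (demand, capacity, min/max headway) are illustrative assumptions.

def required_headway(passengers_per_hour: int, train_capacity: int,
                     min_headway_s: int = 120, max_headway_s: int = 600) -> int:
    """Largest headway (seconds) that still serves the hourly demand, clamped to limits."""
    if passengers_per_hour <= 0:
        return max_headway_s
    trains_needed = -(-passengers_per_hour // train_capacity)  # ceiling division
    headway = 3600 // trains_needed
    return max(min_headway_s, min(headway, max_headway_s))

# Hypothetical peak: 18,000 passengers/hour with 1,200-passenger trains.
print(required_headway(18_000, 1_200))  # 15 trains/hour -> 240 s headway
```

Feeding this function with AI-generated demand predictions, rather than a fixed timetable, is what turns it into the dynamic service the paragraph describes.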
An interesting concept discussed at the workshop is for AI to make the platform screen doors project a colour code, allowing waiting passengers to see where congested and free space will occur on the next train.
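The colour-coding logic itself is trivial once per-carriage loading is known. The load factors and thresholds below are assumptions; the hard part in practice is sensing the loads, not mapping them to colours:

```python
# Sketch: map per-carriage load factors (0.0-1.0, assumed from onboard sensors)
# to a traffic-light colour shown on the matching platform screen door.

def door_colour(load_factor: float) -> str:
    """green = free space, amber = filling up, red = congested (thresholds assumed)."""
    if load_factor < 0.5:
        return "green"
    if load_factor < 0.85:
        return "amber"
    return "red"

# Hypothetical next train, one load factor per carriage.
loads = [0.92, 0.60, 0.35, 0.20]
print([door_colour(x) for x in loads])  # ['red', 'amber', 'green', 'green']
```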
Facial recognition systems are already in use in many transport facilities, for purposes such as border control and anti-jaywalking surveillance. Providing AI with visual access to the public arena may prove valuable in recognising abnormal behaviour such as a lost child, a sick person needing assistance, or a person with mischievous intent. At a larger scale, AI would recognise patterns of behaviour within normal or crowded situations (building on its experience), and alert operators to the likelihood of a problem occurring, along with a proposed solution (such as closing a station, re-routing public buses, or even proactively bringing the situation to the attention of the emergency services).
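At its simplest, "recognising abnormal behaviour" is statistical: compare what the cameras see now with what is normal for that place and time. The footfall counts and threshold below are invented for illustration; a deployed system would learn its baseline continuously:

```python
# Sketch: flag abnormal crowding with a z-score against historical counts.
# The history and threshold are assumptions; a real system would learn both.

from statistics import mean, stdev

def is_abnormal(history: list[int], current: int, threshold: float = 3.0) -> bool:
    """True when the current count sits more than `threshold` std devs from the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Hypothetical footfall counts for the same station and time of day.
typical = [480, 510, 495, 505, 490, 500]
print(is_abnormal(typical, 900))  # a sudden surge -> True
print(is_abnormal(typical, 500))  # within normal variation -> False
```

A flagged anomaly would then trigger the operator alerts and proposed responses described above.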
There remain constraints to the collection and processing of data, not least of which being cyber security, and the regulatory governance of this relatively new and hugely expanding field.
It is interesting to note that AI itself is one of our strongest allies against cyber security breaches, as it can quickly and efficiently recognise abnormal or inappropriate behaviour in access to data.
As for governance, there are those such as Elon Musk who, even as a developer and entrepreneur of AI systems in his company’s products, foresee a point beyond which we cannot turn back. As with the internet, AI is now part of our lives and impossible to remove. He raises the question: should we give control of our lives to AI processes, and will we be able to turn them off if necessary?
Loss of control to AI is a scenario well known to fans of science fiction, such as when HAL from 2001: A Space Odyssey (released 50 years ago) rejects the commands of his ship’s human crew, going so far as to kill them in favour of what he judges a more important task. (The name ‘HAL’ derives from Heuristically programmed ALgorithmic computer, i.e. AI.)
Image of HAL: 2001 A Space Odyssey
Setting aside the content of any interaction with AI systems, a key product of the systems themselves is the ability to communicate using natural language processing (NLP).
In fact, this article was dictated into an iPhone, wherein the AI interprets my voice, language, accent and sounds, not only using a database, but also learning to understand my particular way of communicating and the technical content, informed and taught by any minor corrections I may make. This AI is even able to predict what word I may need next, based on its learning of the way in which I construct sentences.
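The next-word prediction described here can be sketched as a tiny bigram model: count which word tends to follow which, then suggest the most frequent follower. The toy corpus below is purely illustrative; real keyboards use far richer language models:

```python
# Sketch: a minimal bigram next-word predictor, trained on a toy corpus.
# Real predictive text learns from vast corpora and the individual user.

from collections import Counter, defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count, for every word, which words follow it across the corpus."""
    follows = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(model: dict, word: str):
    """Most frequent follower of `word`, or None if the word was never seen."""
    counter = model.get(word.lower())
    return counter.most_common(1)[0][0] if counter else None

model = train_bigrams("the next train departs the next platform the next train waits")
print(predict_next(model, "next"))  # "train" follows "next" most often
```

The "learning from my corrections" the paragraph mentions corresponds to updating these counts as the user types.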
So an AI system is able to communicate with a user of a transport system (the travelling public), informing them (in any language) of travel options, changes, and advice regarding routing and destination. This may be through a smartphone or personal digital assistant as part of a mobility as a service (MaaS) platform, or through public announcement and information systems available within the public domain. This overlaps with the development of Augmented Reality (AR), and will inevitably result in context-sensitive information being available to transport users.
The operation and maintenance (O&M) of public transportation is informed through digital asset management (DAM) systems which communicate to staff through an easily understood interface. In this way, repair and maintenance regimes efficiently predict necessary action, alert operation and control staff to possible failure, and assist in accident, crime and emergency prevention.
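A core element of such predictive maintenance is projecting when a monitored component will cross its limit. The brake-pad figures and linear trend below are assumptions for illustration; real DAM systems use far more sophisticated degradation models:

```python
# Sketch: linearly extrapolate wear readings to estimate days until a limit.
# The component, readings, and limit are hypothetical illustrations.

def days_until_limit(readings: list[float], limit: float):
    """Extrapolate daily wear readings linearly; None if wear is not increasing."""
    if len(readings) < 2:
        return None
    rate = (readings[-1] - readings[0]) / (len(readings) - 1)  # wear per day
    if rate <= 0:
        return None
    return (limit - readings[-1]) / rate

# Hypothetical brake-pad wear (mm) sampled daily, with a 10 mm replacement limit.
wear = [6.0, 6.4, 6.8, 7.2]
print(days_until_limit(wear, 10.0))  # about 7 days at the current rate
```

Scheduling the repair before that projected date, rather than after a failure, is the efficiency gain the paragraph describes.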
The data from this holistic experience of the travelling public, and those responsible for operating and maintaining this system, can be fed back into the dataset allowing for constant improvement as a direct response to use. This understanding also predicts potential problems, and informs necessary actions in the event of system stress and/or failure.
It is widely accepted that public transportation cannot fund itself purely from fare generation. With the introduction and expansion of AI systems, according to Alok Jain (Managing Director of Trans-Consult Asia), it is possible to achieve a 20-30% increase in revenue and a 30-40% decrease in the cost of operation.
Artificial intelligence is already part of our lives whether we realise it or not. From self-driving systems in cars, to the smart phone in our hands, these positive interventions will increase in complexity and scope, whilst becoming more intuitive and ubiquitous.
I believe that we have an obligation to respond to such technologies (if not to inform and drive them), and to consider how they change the interaction of people with the infrastructure we create. As transport architects, we are particularly focussed on customer experience, the impact of ‘invisible’ fare collection on station planning, and the need to future-proof our transit hubs against social change.