NEW YORK, United States — Artificial intelligence and its more practical stepchild, machine learning, are undoubtedly having a profound impact on all industries. Business is built on predictability, and if any company can be effective at predicting the factors which affect its supply chain, costs, resources, processes and market, it has a better chance to thrive. These tools address problems in those areas and are coming into widespread use, but as thousands pile on the buzzword bandwagon, it is important to understand the limitations and usefulness of such technologies in the near term.
We’ve been living in the information age for a while now, but a constant problem has been that there is simply too much information for humans to process on their own. Enter machine learning. This simple, practical form of artificial intelligence is nothing like HAL from 2001: A Space Odyssey or Mr. Data from Star Trek: it is not creative or truly thinking, not sentient or self-aware in any way. It’s still just a calculator. Think instead of IBM’s Deep Blue chess computer, something that uses immense computational resources to evaluate every possible move in order to predict an outcome. Cotton crops have been tracked this way for a while now, using meteorological data, weather models and satellite imaging. We can use the same approach to track and manage inventory, customer relationships, vendors and competitors.
None of this is very new. At a fundamental level, the computer simply follows a decision tree, going down the line asking, “Does action X result in outcome Y?” In fact, you can code that yourself fairly easily in a spreadsheet for simpler problems. What makes machine learning a major new frontier is that data scientists can write algorithms that increase processing efficiency. Instead of trying all 900 billion possible combinations of variables to solve a particular problem, you can group them and do the math on a much smaller selection to get a similar result within a reasonable degree of certainty. Organic, analog complexity overwhelms interpretation systems until either raw processing power overwhelms the complexity or innovative shortcuts are found. Good old-fashioned human-coded algorithms are what allow computers to make those leaps.
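To make the “does action X result in outcome Y?” loop concrete, here is a toy sketch in Python. The rules and numbers are invented purely for illustration; a real system would encode far more conditions, but the structure is the same one you could build in a spreadsheet:

```python
# A toy decision-tree-style rule check. The rules and thresholds below are
# hypothetical, invented for illustration only.

def predict_outcome(action, rules):
    """Walk a simple decision table: return the predicted outcome
    for an action, or None if no rule matches."""
    for condition, outcome in rules:
        if condition(action):
            return outcome
    return None

# Hypothetical rules a merchandiser might encode: what happens at
# different discount levels?
rules = [
    (lambda a: a["discount"] >= 0.5, "margin loss"),
    (lambda a: a["discount"] >= 0.2, "inventory clears"),
    (lambda a: True, "inventory lingers"),
]

print(predict_outcome({"discount": 0.3}, rules))  # prints "inventory clears"
```

The machine-learning leap described above is simply that, instead of a human writing those rules by hand, an algorithm derives them from data, and clever grouping lets it do so without testing every combination.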
The business case for all this lies in better predicting market trends, process vulnerabilities and other factors that affect the bottom line. But there is a real cost to the level of processing required to sift through all the new troves of data we’re collecting. A big fashion company might spend hundreds of thousands of dollars per month on cloud computing to process the data flowing in from suppliers, retail platforms and market intelligence, but a good data scientist can tweak how the processing is run and cut it to a fraction of the cost, while at the same time providing new insights about the data. The human is the key, because the machines aren’t really intelligent; they still just do what operators tell them to do, albeit with fantastic speed and accuracy.
What is the human quality that A.I. lacks? Intuition, the ability to leap beyond logic. As a computer scientist might put it: IF a traditional computer can leap beyond logic, THEN it is broken. Anything else at this point is science fiction at best.
Machine learning is all about the complexity of the simulation you use to teach the machine. As complexity goes up, so do resource requirements. How accurate can a digital model of real-world conditions be before the details clog the process?
As global fashion supply chains become more automated and transparent, the utility of systems which can do this safely and efficiently becomes very clear.
Frequently when people talk about machine learning, they talk about training. Think of your machine learning system as an intern. In the beginning, it knows nothing about your business. It starts out clueless, fails a lot, but then (hopefully) it starts to learn. Choosing the right model for analysing your data is a lot like interviewing candidates for an internship. You have to judge them on what they seem to be good at doing at a basic level, but once you employ them you still have to train and monitor them until they get up to speed, and sometimes the first candidate selected is not a good fit and you need to start all over.
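The interviewing analogy can be sketched in a few lines of Python. Everything here is hypothetical: the sales data is synthetic, and the two “candidates” are deliberately simple models, one naive and one slightly smarter. Each is trained on the same data, scored on data it has never seen, and the better performer gets the job:

```python
# A minimal sketch of "interviewing" candidate models. The data and the
# two candidate models are hypothetical, for illustration only.
import random

random.seed(0)
# Synthetic data: weekly sales roughly proportional to marketing spend.
data = [(x, 3 * x + random.uniform(-5, 5)) for x in range(100)]
train, holdout = data[:80], data[80:]  # hold some data back for the "exam"

def fit_mean(train):
    """Candidate A: always predict the overall average (a naive baseline)."""
    avg = sum(y for _, y in train) / len(train)
    return lambda x: avg

def fit_slope(train):
    """Candidate B: a least-squares line through the origin."""
    slope = sum(x * y for x, y in train) / sum(x * x for x, _ in train)
    return lambda x: slope * x

def score(model, holdout):
    """Mean absolute error on unseen data -- lower is better."""
    return sum(abs(model(x) - y) for x, y in holdout) / len(holdout)

candidates = {"mean": fit_mean, "slope": fit_slope}
best = min(candidates, key=lambda name: score(candidates[name](train), holdout))
print(best)  # prints "slope"
```

The held-out data plays the role of on-the-job monitoring: a candidate that merely memorised the training examples, or guessed blindly, is exposed the moment it faces data it has not seen before.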
Today most customised machine learning applications are built atop enterprise-quality subscription-based modules from vendors like Microsoft, Google and SAP, and free open-source shared code libraries maintained by thousands of programmers who keep them updated and secured as a global community. While innovation is ongoing, much of the infrastructure is already built and available to deploy fairly quickly. Training your new system to interpret data so it is useful is the hard part.
Artificial intelligence connects in interesting ways to another buzzword topic: blockchain. Contemporary blockchain technology has built-in smart contract capabilities, which allow for automatic payment when a specific task is completed. By connecting a blockchain payments system to databases and sensors, an A.I. program can be used to determine whether a task has been completed and payment should be made. Did 500 units arrive in the store? Yes. Pay the vendor. And don’t worry too much about the immediacy of blockchain transactions, as there are already several regulated blockchain exchanges acting like banks and providing businesses with liquid capital in this new realm. Credit isn’t going away.
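The “500 units arrived, pay the vendor” logic above is, at its core, a conditional check gating a payment. Here is a toy Python sketch of that idea; the function and ledger are hypothetical stand-ins, since a real system would read from databases and sensors and settle the payment on a blockchain:

```python
# A toy sketch of a smart-contract-style check: release payment only when
# the delivery condition is met. All names here are hypothetical.

def check_and_pay(expected_units, received_units, ledger, vendor, amount):
    """If the full shipment arrived, record a payment and report success."""
    if received_units >= expected_units:
        ledger.append({"to": vendor, "amount": amount})
        return True
    return False

ledger = []
paid = check_and_pay(500, 500, ledger, "vendor-42", 10_000)
print(paid, ledger)  # prints: True [{'to': 'vendor-42', 'amount': 10000}]
```

The point is not the code itself but where the input comes from: the A.I. layer’s job is to decide, from messy sensor and database signals, whether “500 units arrived” is actually true before the payment fires.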
A.I. is destined to replace middlemen. Need to source something? A.I. will do it best. Need to manage a task or a large team? It watches and learns all the time, so A.I. will become the perfect middle manager. Yet you probably still want a human manager giving final approval of transactions until the A.I. has performed the task properly for a good long while. It’s really not going to start out smart, and it has to learn from mistakes. But the more we use these systems, the more reliable they become, to the point where they will be far better at middle-management tasks than any human ever could be. And since they’re being built on shared platforms, they don’t just learn from you, they learn from everyone, so you don’t have to do all the training yourself.
We don’t know how complex and intelligent these systems will become in the long term, but right now they need a lot of work. They cannot predict things that are not based on patterns they’ve already seen, and they can be tricked into integrating bad data fairly easily, so real innovation and big new cultural changes will generally throw them off. Perhaps the best response to the question of whether artificial intelligence can ever really understand fashion is simply to ask another question: Do we?
Charles Beckwith is director of communications at Save The Garment Center and the host of American Fashion Podcast on MouthMedia Network.
The views expressed in Op-Ed pieces are those of the author and do not necessarily reflect the views of The Business of Fashion.