Author: Stephen Forsellus, Chief Technology Officer of Easyfairs and member of the UFI Digital Innovation Working Group.
There is no doubt that Artificial Intelligence has been in the spotlight over the past few months. It seems like every trade show organizer has begun exploring, validating, experimenting, investigating, and even implementing AI in their business environment.
While most companies seem to rely on external partners to innovate in this area, some have decided to innovate primarily “in-house” and recruit the required (and scarce) talent. My goal today is to look at the options from both sides. To do this, I want to share my own experience as CTO of Easyfairs, as well as the experiences of some of the most innovative peers in the industry.
I must admit that I have always been fascinated by artificial intelligence. In the mid-1990s, during my university years, I worked on the first neural networks available at the time (developed by the University of Stuttgart, which still maintains a webpage that preserves the look and feel of the early 90s). The neural networks of the time were indeed very simple. The computing power at our disposal (especially as students) did not allow us to create the deep learning algorithms that are so useful today. However, we already had discriminative neural networks that could interpret handwriting word by word, however laboriously. But the seed had been planted…
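For readers curious what such a “discriminative” network actually does: below is a minimal sketch of a small feedforward classifier that learns to recognise handwritten digits. It uses modern Python and scikit-learn’s built-in digits dataset purely for illustration; the library, dataset, and layer sizes are my own assumptions, not the Stuttgart tooling we used back then.

```python
# Minimal, illustrative sketch of a discriminative neural network for
# handwriting recognition. Modern scikit-learn on its small built-in
# digits dataset -- NOT the 1990s Stuttgart (SNNS) setup described above.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 8x8 grayscale images of handwritten digits (0-9)
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.2, random_state=42
)

# A small multi-layer perceptron: it learns to *discriminate* between
# digit classes rather than to generate new samples.
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=42)
model.fit(X_train, y_train)

print(f"Test accuracy: {model.score(X_test, y_test):.2f}")
```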
This is probably why, when AI became the ‘next big thing’ in tech, we at Easyfairs decided back in 2018 to make it part of our ‘core business’. Step one: set up an internal team tasked with creating a company-wide data warehouse, collecting data from all systems, both current and historical. The goal was to have a repository that we could use to train deep learning algorithms (generative AI didn’t exist at the time) to help all aspects of our business. As a nice side effect, this gave us a repository of the ‘single version of the truth’ of our data. But that’s another topic.
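To make the “single version of the truth” idea concrete, here is a tiny, hedged sketch of what consolidating records from several source systems into one warehouse table might look like. The system names, columns, and pandas-based approach are hypothetical examples for illustration, not a description of Easyfairs’ actual pipeline.

```python
# Hypothetical sketch: consolidating records from several source systems
# into a single warehouse table, so every downstream report and model
# reads from the same "single version of the truth".
import pandas as pd

# Illustrative extracts from three imaginary source systems
registrations = pd.DataFrame(
    {"visitor_id": [1, 2], "event": ["Expo A", "Expo B"], "registered": [True, True]}
)
crm_contacts = pd.DataFrame(
    {"visitor_id": [1, 2], "company": ["Acme", "Globex"], "country": ["BE", "DE"]}
)
badge_scans = pd.DataFrame(
    {"visitor_id": [1], "scanned_at": pd.to_datetime(["2023-05-10 09:32"])}
)

# Merge on a shared key so each visitor ends up with one consolidated row
warehouse = (
    registrations
    .merge(crm_contacts, on="visitor_id", how="left")
    .merge(badge_scans, on="visitor_id", how="left")
)

print(warehouse)
```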
Since then, we have grown an in-house team of 10 data engineers, data analysts, and data scientists, who are an important part of our technology ecosystem. We use several deep learning algorithms in our daily business, and at any given time we have at least three to five new algorithms or generative AIs in development, testing, or proof-of-concept. Of course, not all of them will make it to “production.” But we are learning, improving, and most importantly, having fun!
Why did we choose to do this “in-house”? For me, it has several advantages:
- I work with people I like, who are part of my team and are fully committed to Easyfairs.
- We build our own experience, and we don’t have to share it with anyone (except you, of course).
- We don’t hesitate to rerun an experiment: the team is there anyway!
- We have a very stable team where people can learn and improve over time.
- The team is highly specialized in one area: events! They know everything about the business and they fully understand every aspect of the available data.
- Did I mention that I love working with them?
But insourcing has its drawbacks, and many other CTOs decide to outsource AI. To better understand their reasons, I asked some of the best CTOs in the industry why they chose to do it in-house or to outsource.
I discussed this with a few “fellow CTOs”:
Do you develop AI capabilities with an internal team, or do you prefer to use external partners?
Nedved: That’s a good question. I think it depends on the situation… For the data platform part, i.e. collection, storage, management, governance, etc., I plan to build it in-house. Data analytics we will do in-house; data modeling we will start by outsourcing or by using existing external products. After a while, we will identify 1-2 key areas to bring in-house if doing so gives us a huge competitive advantage over others. For AI solution development, we will design and frame the problem in-house, closest to the business users. The development part will most likely be outsourced or done using existing external products. Governance should be done in-house.
Patrick: We develop AI capabilities within internal teams. This is especially true for engineers, data scientists, and architects, where we add new incremental “AI-specific roles” and FTEs. For other business functions, there are fewer incremental roles/FTEs; it is more about training (or, where needed, replacing) people in existing internal roles so they get the most value from AI opportunities within their existing functions.
Alistair: We do not currently have any in-house ML modeling or AI creation capabilities. I prefer to use as much commoditized AI as possible without tying up internal resources. However, it is clear that because the B2B and B2C events business models are so distinctive, there will always be a need to create custom ML models to meet business needs.
Did you make this decision a long time ago?
Nedved: Yes, mostly. But with the advent of generative AI, things have changed a lot, and we have more opportunities to leverage products externally to improve processes.
Patrick: In principle, we decided a few years ago to build digital capabilities within the company, which we considered essential for our core business and long-term growth. However, based on the dynamic developments starting in November 2022, we decided in Q1 2023 to include AI in this “must-have” category. There are many other developments in the field of digitalization that could have a very important and disruptive impact on our business in the future, which we are observing but are not yet investing money or significant resources in (e.g. the Metaverse, NFTs, etc.).
Alistair: No, the pace of change in the AI industry and the development of commoditized AI tools available in platforms like AWS and GCP means that decisions around this area must remain flexible and agile.
Why make such a choice?
Nedved: We do not have sufficient funding to build a large development team. Increasing our AI modeling capabilities is very challenging given the resources required, competition for talent, and retention issues. It may be more cost-effective to allocate our limited internal resources to areas that require close interaction with business teams and where our competitive advantage lies.
Alistair: We first need to get senior leadership involved in this topic to help set the direction. This can be done internally on a small scale first; I think we need an experienced leader who can use commoditized AI platforms to conduct rapid experiments before scaling up.
Are you satisfied with your choice?
Nedved: We are currently in the early stages of integrating enterprise systems and building a data platform. More time is needed for experimentation and adaptation.
Patrick: Yes. We will have to develop additional capacities in an “evolutionary manner” anyway, and we can always adjust the pace and investment level here and there, not least because Germany lacks skilled professionals who meet our requirements.
Alistair: Yes
Do you plan to change your strategy in the future?
Nedved: Yes, I hope that will change as we gain a deeper understanding of AI.
Patrick: No
Alistair: Yes – if businesses can focus on defining the actual questions they want to answer, I think this will inform the type and style of model needed, and therefore change this decision.
As you can see, there is more than one way to bake a cake.
What’s your take? Which side are you on? Team “in-house sourcing” or Team “outsourcing” (or Team “wait and see”)? I’d love to hear your thoughts in the comments!