Q1 2023 Innodata Inc Earnings Call

Participants

Amy R. Agress; Senior VP, General Counsel & Corporate Secretary; Innodata Inc.

Jack S. Abuhoff; President, CEO & Director; Innodata Inc.

Marissa Espineli; Interim CFO, VP of Finance & Corporate Controller; Innodata Inc.

Dana Buska

Timothy Clarkson; Partner, Top Producer, President of Investments & Stockbroker; Van Clemens & Co. Incorporated

Presentation

Operator

Greetings. Welcome to Innodata's First Quarter 2023 Earnings Call. (Operator Instructions) Please note, this conference is being recorded.
I will now turn the conference over to your host, Amy Agress. You may begin.

Amy R. Agress

Thank you, John. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata; and Marissa Espineli, Interim CFO. We'll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the first quarter. We'll then take your questions.
First, let me qualify the forward-looking statements that are made during the call. These statements are being made pursuant to the safe harbor provisions of Section 21E of the Securities Exchange Act of 1934 as amended, and Section 27A of the Securities Act of 1933 as amended. Forward-looking statements include, without limitation, any statements that may predict, forecast, indicate or imply future results, performance or achievements.
These statements are based on management's current expectations, assumptions and estimates and are subject to a number of risks and uncertainties, including, without limitation: the expected or potential effects of the novel coronavirus (COVID-19) pandemic and the responses of governments, the general global population, our customers and the company thereto; impacts from the rapidly evolving conflict between Russia and Ukraine; investments in large language models; that contracts may be terminated by customers; that projected or committed volumes of work may not materialize; pipeline opportunities and customer discussions, which may not materialize into work or expected volumes of work; acceptance of our new capabilities; continuing Digital Data Solutions segment reliance on project-based work, the primarily at-will nature of such contracts and the ability of these customers to reduce, delay or cancel projects; the likelihood of continued development of the markets, particularly new and emerging markets, that our services and solutions support; continuing Digital Data Solutions segment revenue concentration in a limited number of customers; potential inability to replace projects that are completed, canceled or reduced; our dependency on content providers in our Agility segment; a continued downturn in or depressed market conditions; changes in external market factors; the ability and willingness of our customers and prospective customers to execute business plans that give rise to requirements for our services and solutions; difficulty in integrating and driving synergies from acquisitions, joint ventures and strategic investments; potential undiscovered liabilities of companies and businesses that we may acquire; potential impairment of the carrying value of goodwill and other acquired intangible assets of companies and businesses that we acquire; changes in our business or growth strategy; the emergence of new, or growth in existing, competitors; our use of and reliance on information technology systems, including potential security breaches, cyber attacks, privacy breaches or data breaches that result in the unauthorized disclosure of consumer, customer, employee or company information, or service interruptions; and various other competitive and technological factors and other risks and uncertainties indicated from time to time in our filings with the Securities and Exchange Commission, including our most recent reports on Forms 10-K, 10-Q and 8-K and any amendments thereto.
We undertake no obligation to update forward-looking information or to announce revisions to any forward-looking statements except as required by the federal securities laws, and actual results could differ materially from our current expectations.
Thank you. I will now turn the call over to Jack.

Jack S. Abuhoff

Good afternoon, everybody. Thank you for joining our call. As you probably saw in our announcement earlier today, we have quite a lot of exciting news to share with you. Over the last couple of weeks, we received verbal confirmation from 2 of the largest 5 global technology companies that we have been selected to provide data engineering for their innovation programs in generative AI, the technology behind ChatGPT.
One of these companies is an existing customer and the other one will be a new customer. In addition, a third company, also a new customer and another of the largest 5 global technology companies, has indicated that they are likely to choose us and we have just reached agreement with them on terms of a master services agreement. We believe these accomplishments are potentially transformative for Innodata.
For these companies, we expect to be performing a range of data engineering work required to build cutting-edge generative AI. We expect this potentially to include creating training data sets that are used to train the models, providing instruction data sets that teach the models to follow instructions, providing reinforcement learning, a process by which we align models with human values and complex use cases, and providing red teaming and model performance evaluation. This involves potentially working in multiple languages and across several data modalities.
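As an illustrative aside, here is a minimal sketch of what an instruction data set record can look like, assuming the generic JSONL-style layout that many open-source trainers accept; the field names and file name are assumptions for illustration, not necessarily the format Innodata actually delivers.

import json

records = [
    {
        "instruction": "Summarize the key risk factors in the passage below.",
        "input": "The company depends on a limited number of customers ...",
        "output": "Revenue is concentrated in a few customers, so losing any one ...",
    },
    {
        "instruction": "Translate the sentence into French.",
        "input": "Data engineering unlocks proprietary data.",
        "output": "L'ingénierie des données libère les données propriétaires.",
    },
]

# Write one JSON object per line, the layout most instruction-tuning pipelines expect.
with open("instruction_data.jsonl", "w", encoding="utf-8") as f:
    for record in records:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

Training on many such records is, roughly speaking, what teaches a base model to follow instructions rather than merely continue text.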
I will be focusing today's call on first: Our view of the industry landscape and the tremendous change, we believe generative AI will unleash on how businesses operate worldwide; second, how we are positioning Innodata to take advantage of this tremendous change; third, why we believe we have been able to get rapid traction with the largest tech companies as they seek to build out their generative AI capabilities and our growth drivers going forward; fourth, how we're thinking about the potential cadence of the financial impact from these wins, both this year and beyond; and fifth and last, our Q1 results and forward guidance.
We believe that generative AI will be even more transformative for businesses and consumers than the Internet, and that is because of the tremendous productivity advantages we believe the technology will provide, as well as the innovative customer experiences we believe it will enable. Virtually every company in the world will have to incorporate this technology to be competitive. When the Internet burst into the collective consciousness with Netscape's IPO in 1995, most people could see the Internet's potential, but the basic infrastructure, like broadband, ad servers, logistics, et cetera, was not yet in place, resulting in massive capital destruction as revenues were late in coming and monetization capabilities were slow to develop.
Of course, when that infrastructure was built over time, it created several of the most valuable companies in the world and revolutionized how business is conducted. By contrast, in the case of generative AI, we believe the adoption cycle and benefits of adoption will potentially be realized much more quickly. We expect this will be driven by the massive productivity increases that adoption of generative AI will likely provide. A recent paper by MIT referenced a 37% increase in productivity among workers using ChatGPT versus their legacy processes. This is an industrial revolution level leap in human productivity. The adoption of generative AI, in our opinion, may do for the service sector what the steam engine and the electric motor did for the industrial sector.
We believe our immediate opportunity is with the large mega-cap tech companies, as we announced today, that are in a race to build the generative AI foundation, as well as new entrants, companies like Anthropic, Character AI and [HumAIn], that are raising massive amounts of capital to enter the race.
These companies plan on generating revenues from providing the foundational layer of generative AI and licensing that technology to third parties to build their own AI applications for their specific niches and use cases. We believe that adoption by these licensees will potentially be very rapid, potentially much more rapid than the Internet, because the application of this technology is primarily focused on enhancing productivity and because the infrastructure and ecosystem needed for its implementation are readily available and proven. Moreover, companies will likely fear that if they do not incorporate these capabilities, their resulting relatively higher cost structures may render them uncompetitive, to say nothing of a customer experience that will seem so yesterday.
The second opportunity for Innodata is to work with these large technology companies, providing data engineering services to their end customers that help those end customers build solutions on these foundation models or fine-tune their own versions of the foundation models. We see the opportunity to provide these services on both a side-by-side basis and on a white label basis. We are in active discussions with 2 of these companies about doing just that. We hope to have progress to report on this front by our next call.
The third opportunity is to help large businesses build their own proprietary generative AI models. We see this as likely becoming increasingly attractive to large businesses that have unique, large data sets. We believe this will become an increasingly viable path based on 3 predictions we are making. First, that high-performing, commercially usable open-source generative AI models will become increasingly available. Second, that progressively more effective techniques for model fine-tuning will become available. And third, that the marginal cost of compute will, over time, significantly decline, thanks to innovations in GPUs and other systems.
We're already seeing evidence of this. We believe, for example, we are close to signing a master services agreement with a leading investment bank that earlier this year deployed 40 data scientists to train LLMs in the context of their data. Their vision is to enable their people to, quote, chat with their data, much the way that people query databases today.
There are essentially 3 main areas of cost in building generative AI: the cost of building the model, the cost of compute and the cost of data engineering. As models commoditize, fine-tuning cycles become less expensive and compute costs come down over time, we expect that large businesses will see the remaining cost, data engineering, as a high ROI investment.
Moreover, we expect it will become economic for them to address progressively more ambitious and sophisticated use cases. We believe that the data engineering work that will be required, including data collection, creation, curation, preprocessing for LLM training and testing will become the key to unlocking the potential competitive advantage that their proprietary data represents.
This transitions to our third point, why we believe Innodata has been able to gain such rapid traction with leading companies focused on this new arms race. I would say sometimes it is better to be lucky than smart. We did not predict that generative AI would burst onto the scene as quickly as it has, nor are we architects of the system, but our long history of building capabilities for other applications which are now directly applicable to generative AI, we believe has put us in an enviable position.
We have in place scalable domain expertise in diverse areas, including material sciences, agriculture, biology, math, legal and pharma, with thousands of subject matter experts around the world. We have a proven reputation for creating consistently high-quality data sets in complex subject areas rife with subjectivity. We have a global reach, enabling us to work in 40-plus languages. We have the technical acumen we have developed over the past 7 years around AI model training and deployment. And we have developed flexible platforms to digest and ingest data, to auto-annotate data with zero-shot learning and to track quality metrics in real time.
We believe the technology choices we made starting back in 2016 and 2017 turned out to be tailor-made for generative AI. For example, we chose encoder-based transformer architectures that supported generative AI. At the time, this was hardly an obvious choice. While our first models only had 50 million parameters versus, for example, today's GPT-4 that is reported to have 1 trillion parameters, the essential architecture is what we settled on 7 years ago. Moreover, our experience taught us how to deploy models into real-world production environments in ways that are safe and value creating. So maybe it was not all luck.
In addition, over the past several years, we have been integrating both generative and classical AI into real-world customer operations, workflows and platforms. We have developed recipes and technologies for building and configuring fine-tunable LLMs and overcoming safety and trust challenges. More recently, we have developed deep expertise in prompt engineering and prompt chaining, important skills required to deploy foundation models. As we work with our largest customers on the implementation side, as we hope to, and with large businesses on a go-it-alone basis, as we plan to, we believe our experience will be both repurposable and invaluable.
Fourth, I want to give a framework for the customer wins and positive indication of a win we announced today and broadly speak to the potential magnitude of these accomplishments and how we believe they could show up in our results. These 3 customers -- notified us over the past 10 days. As I mentioned earlier, but it's worth repeating, each of these 3 customers is in the list of the top 5 global tech firms. One is an existing customer while the remaining 2 are new. With our existing customer, we are in the process of putting in place the work order for a new win. With the 2 new customers, we are now in the process of putting in place formal agreements and finalizing initial scope.
While we are confident that the agreements will be inked, until this happens, there is always the chance that they are not. The deals are potentially large. To illustrate, one of the new customers has indicated that it will cut an initial purchase order for $2.5 million to get us started, but that it will be supplementing that as we move forward. They also shared with us their vision for where the initial scope of work might go, which, if fully realized, we believe could result in approximately $12 million of new quarterly revenues at maturity. Moreover, the draft contracts that are now being worked on are, at our customers' request, framework agreements that enable them to easily add scope.
Again, I want to emphasize that deals with the 2 new potential customers have not yet been signed, but we're expecting to get them signed and for the details to be worked out over the next few weeks. Based on our experience, we believe that these engagements will ramp up over the course of several months. Typically, once an agreement is finalized, we work with the customer to create detailed specifications and run pilots to ensure that the specifications are yielding the intended results. Oftentimes, this requires several iterations. Once the specification is locked down, our next step is to put in place the required infrastructure. This includes custom-configuring technology systems, finalizing process designs and assembling human resources, data engineers and subject matter experts.
This can typically take 2 to 3 months. Once this is completed, ramp-up begins. We typically ramp up slowly so that we can continue to test and refine as necessary with the customer. Ramp-up itself can take 3 to 6 months to achieve steady state. The duration of the engagement will depend upon many factors, including the size of the model being built and the amount of ongoing updating and tuning that will be taking place. These are often not knowable in advance. This is a dynamic process with customer dependencies and checkpoints throughout, which makes it tough to do quarterly forecasts. But based on our experience, full ramp-up is typically achieved in roughly a 12-month period.
Finally, let's talk about our Q1 results and guidance. In Q1, revenue was $18.8 million, and adjusted EBITDA was $800,000. There was no revenue in the quarter from the 3 deals we are announcing today, all of which happened in just the last couple of weeks, well after the Q1 close. It is also worth noting that there was no revenue in the quarter from the large social media company that contributed $4.4 million in revenue in Q1 of last year but dramatically pulled back spending in the second half of last year as it underwent a significant management change.
If we back out revenue from this large social media company, our revenue growth in the quarter over Q1 2022 would have been 12%. We believe that it is possible that business from this customer could resume in the second half, but our 2023 business plan does not account for this upside. Our 2023 business plan also does not account for revenue from the 2 new customers we have spoken about today. Even without these elements incorporated, we expect that, exiting this year, our revenue growth rate could potentially be in the high teens or in the 20s if we back out from 2022 the revenue from the large social media company we just discussed, and that our adjusted EBITDA run rate could potentially exceed $15 million on an annualized basis.
We ended the quarter with a healthy balance sheet, no appreciable debt and $10.8 million in cash and short-term investments on the balance sheet. I'll now turn the call over to Marissa to go over the numbers, and we will then open the line for questions.

Marissa Espineli

Thank you, Jack. Good afternoon, everyone. Allow me to provide a recap of our Q1 results. Revenue for the quarter ended March 31, 2023 was $18.8 million compared to revenue of $21.2 million in the same period last year. And as Jack mentioned, the comparative period included $4.4 million in revenue from a large social media company that underwent a significant management change in the second half of last year and, as a result, dramatically pulled back spending across the board. There was no revenue from this company in the quarter ended March 31, 2023.
Net loss for the quarter ended March 31, 2023 was $2.1 million or $0.08 per basic and diluted share compared to net loss of $2.8 million or $0.10 per basic and diluted share in the same period last year. Our adjusted EBITDA was $0.8 million in the first quarter of 2023 compared to adjusted EBITDA loss of $1 million in the same period last year. Our cash and cash equivalents, including short-term investments, were $10.8 million at March 31, 2023, and $10.3 million as of December 31, 2022.
Again, thanks, everyone. John, we are now ready to take questions.

Question and Answer Session

Operator

(Operator Instructions) And the first question comes from Tim Clarkson with Van Clemens.

Timothy Clarkson

Jack, exciting news. This is kind of a basic question, but my customers always ask it, so I'll ask it for them. How many other companies could potentially do the kind of stuff you're doing in AI?

Jack S. Abuhoff

It's a good question, Tim. I mean, I don't have a real count on that. There are several companies that we do run up against that are talked about in the marketplace. I can tell you that with 1 of the 3 customers, the likely win we're announcing today, we were told that we were competing against 17 companies. And that competition has been going on for several months. It started with 17 and got narrowed to 4. And then, as I said, it's looking very good in terms of being able to call that a win and getting that signed.
So I guess there's at least 17 that are entering the fray. I expect that there will be many more. I think that there's a very exciting opportunity, which I presented a little bit of today. And I think it's fortunate, the result of somewhat of luck and some good decisions and planning, that we're in the position that we are. So it's very exciting, and I believe potentially transformative for the company.

Timothy Clarkson

Okay. So kind of a follow-on question. What are the factors you think that are allowing you to get these contracts in competitive situations?

Jack S. Abuhoff

I think what it all comes down to is skills, technology, capabilities and culture. We've got a culture of being very agile, very able to move quickly and to respond to customer demand; customer centricity, understanding what people want, listening really, really super carefully to their needs and kind of embedding ourselves as part of their team. And then it comes down to some of the things that I mentioned on the call today.
We're fortunate to have skills ready to repurpose that are exactly what is required in order to build these technologies: scalable domain expertise; the proven ability to create high-quality, very consistent data sets in complex subjects; 40 languages; the technical acumen of having built the recipes and the technologies to do AI model training and fine-tuning; and having chosen the right architectures over the years. I think all of these are playing a great part in the track record that we're now establishing. Look, I'm thrilled, 3 out of 3, looking very strong. And I'd like to think we're just getting started.

Timothy Clarkson

Right. Supposedly, when Microsoft did their ChatGPT work, a lot of the annotation was done in Kenya for $3 an hour. I mean, is there a difference in the kind of annotation you guys do versus that kind of annotation?

Jack S. Abuhoff

Yes, there is. There is a great deal of difference. Some of the early models that have been built kind of go very broad, but not very deep. There's not a great deal of annotation that's required. Much of what we do is the complex stuff. It's going deep into subject-matter domains. It's going deep into use cases. I think it's the next phase of what will be required, both among the large tech companies as well as other companies that are looking to own that foundation layer, as well as large businesses. So they're saying, look, we've got a tremendous amount of data and information that's proprietary. We're not looking to serve that data up to the foundation models via their APIs; we want to control this.
And I think that will, for them, probably make a ton of sense in light of the developments that are taking place with kind of the commoditization of the technology itself and the availability of it. All of that puts us in a great position to be able to be helpful and to be able to create great ROI on these capabilities that will probably become table stakes.

Timothy Clarkson

Right. Right. Now at what level of revenues do you think you're going to have to add employees? Would it be at $25 million a quarter, or $30 million a quarter, or $35 million a quarter? At what level do you think you would have to add employees?

Jack S. Abuhoff

So I think it's going to depend a lot on exactly what we end up doing. There are people that are in our cost of goods, and we typically don't keep those people around outside of the services work that we do. So there are people that will be added. But I think the important thing is that that's kind of in COGS. From the perspective of infrastructure and core operations, we've got what we need to begin executing on these contracts. There isn't something that needs to be built. There isn't something that we need to go acquire in order to execute. So we're ready to go.

Timothy Clarkson

Sure. Now I saw that you got a line of credit. What -- how does that fit into the business situation?

Jack S. Abuhoff

Yes. So we thought that it was the right thing to do to have that in place. We were anticipating that we might win one of these large opportunities and that it would be useful in terms of helping us manage working capital. That said, it looks like we've got potentially a great chance of having all 3. And then there's more behind that, hopefully, coming. So I think, from a corporate housekeeping perspective, from a corporate governance perspective, it's the right thing to have in place. And I'm very happy to say that we have it in place. And we have a great relationship with Wells Fargo that has started as a result of that. Very happy to be working with them.

Timothy Clarkson

Sure. Any color on Synodex and Agility?

Jack S. Abuhoff

Yes, sure. They're doing well. From an Agility perspective, we had a 4% sequential growth rate in the quarter, which annualizes to something like a 17% growth rate. We did a lot of cost cutting there, but what we were able to do was really hone in on and focus our resources on the best-performing 50% of the sales force. They're doing wonderfully right now. We're very happy with that. For example, we just bagged a $200,000-per-year subscription deal with a large automotive company in Agility. We built PR CoPilot, the generative AI model that we launched within the platform in January. It was a first mover within that industry. We've got strong customer reception and a super cool road map for successive releases around that this year. And we're using that to build a strong stand-alone asset as well as a playground to be able to show other customers: here's what you can do, here's the way you can integrate these technologies and think about creating value from them in a safe and trusted way today.

Timothy Clarkson

Right. What about Synodex?

Jack S. Abuhoff

Yes, sure. So Synodex, we had, I think it was, 8% sequential growth, which would translate into a 36% annualized growth rate. A lot of the effort that we're doing this year is on expanding the addressable market with new product models and new development. We're working really closely with a couple of very large life insurance companies as charter customers, testing new capabilities in disability claims processing, long-term care claims processing and even personal injury. And increasingly, AI is becoming an integrated feature of that capability as well.

Operator

(Operator Instructions) The next question is from Dana Buska with Feltl.

Dana Buska

What you were saying and what you wrote in your press release just sounds amazing. I have a question around the large language models, with them being so new. It sounds to me like you're getting the road map as you go, which I think is a very, very exciting place to be. And I was just wondering, what type of advantage does that give you over your competitors, that you're working with these large companies and being, should we say, the first mover doing this type of stuff?

Jack S. Abuhoff

Yes. I think it gives us potentially a tremendous advantage. And again, we see opportunities kind of on 2 sides of the fence here. One is helping companies build these models, and the second is helping them deploy and utilize the technology. So in the work that we're doing with the large tech companies, we're getting exposed to all sorts of new technologies. We're seeing things well before they come out, potentially. And that gives us the ability to help companies kind of think through how they would deploy things.
In addition, as we go about these tasks, we're building new capabilities. We're going to be building new technologies and new systems and, through that process, identifying weaknesses in existing technologies and trying to innovate around them. So I think with everything that we do, we become stronger and we become better. My prediction is that behind these large tech companies, there's another 40, either well-funded start-ups or large tech companies, who are going to be closely following. We're having conversations with several of them.
And then beyond that, I think large companies with significant proprietary information assets will, over time, have the ability and be able to make a use case for building LLMs themselves. And again, I think we'll be very well positioned to partner with them to do that. So the fact that we're winning these is critical. It will accelerate our innovation. It will accelerate our capabilities and make us progressively more valuable as we go from there.

Dana Buska

Excellent. That also sounds wonderful and amazing. I recently saw a paper out of Stanford talking about being able to use large language models to do reinforced learning. Is that something that you think would potentially be some type of issue for you? Or do you not consider that to be a threat to your business?

Jack S. Abuhoff

Yes. Sorry, I missed the term that you're using. The Stanford study shows that it could be used for what exactly?

Dana Buska

From Stanford University, that large language models could be used to do reinforced learning?

Jack S. Abuhoff

Reinforcement learning?

Dana Buska

Yes, reinforcement learning, sorry.

Jack S. Abuhoff

Yes. So reinforcement learning, sometimes called RLHF, is basically used as part of one of the processes to build the model and to fine-tune the models. You're basically comparing the output of the language model to human-generated pairs of questions and answers, and then you're modifying the internal variables to favor the responses that are similar to the human responses. So that's an inherent process in the model fine-tuning itself.
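As a rough illustration of the comparison described above, here is a minimal Python sketch of the pairwise preference loss commonly used to train the reward model in RLHF. The function and the scores are hypothetical; this is a generic textbook formulation, not Innodata's or any customer's actual pipeline.

import math

# Pairwise preference (Bradley-Terry style) loss for reward modeling: a reward
# model scores two candidate answers to the same prompt, and the loss is small
# when the human-preferred answer outscores the rejected one.
def preference_loss(score_preferred: float, score_rejected: float) -> float:
    margin = score_preferred - score_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

print(preference_loss(2.0, 0.5))  # ~0.20: ranking agrees with the human label
print(preference_loss(0.5, 2.0))  # ~1.70: ranking disagrees, so the loss is larger

Minimizing this loss over many human-labeled comparisons is what nudges the model's internal variables toward human-preferred responses, the alignment step referred to above.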

Dana Buska

Okay. And then when you start looking at building out your prompt engineering practice, do you have those people on staff right now? Or is that someplace where you're going to have to start hiring people? How do you look at creating these new, almost new, jobs, these new positions that you're going to need to fulfill your ambitions?

Jack S. Abuhoff

Yes, it's a great question. We're finding that these skills do not exist in surplus in the world. I was shown, several weeks ago, a job ad for hiring a prompt engineer at $250,000 a year, which I don't know if that was real or not. But it made its way around the Internet. We're taking people that have been with us for a long time within our engineering group, and they're figuring out prompt engineering. There's no real book written about prompt engineering or prompt management or prompt chaining, but we're building techniques. And one of the great things about our company is we've got so many real-world applications of these technologies. So it's not academic, it's not theoretical. We're figuring out how you deploy this safely and in a trustworthy way into real-world production scenarios. And from that, there's a tremendous amount of learning that we're capturing and can repurpose for our customers' continued benefit.
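As a rough sketch of what prompt chaining means in practice, here is a minimal Python example in which the output of one prompt is fed into the next. The call_llm helper is a hypothetical stand-in rather than a real API; an actual deployment would substitute a production model client and add the safety and trust checks mentioned above.

# Hypothetical stand-in for a model endpoint; a real deployment would call an
# actual LLM client here.
def call_llm(prompt: str) -> str:
    return f"<model response to: {prompt[:40]}...>"

def summarize_then_classify(document: str) -> str:
    # Step 1: condense the document so the next prompt stays small and focused.
    summary = call_llm(
        "Summarize the following document in three sentences:\n" + document
    )
    # Step 2: chain the first step's output into a narrower classification prompt.
    return call_llm(
        "Classify the risk level (low/medium/high) of this summary:\n" + summary
    )

print(summarize_then_classify("Example filing text ..."))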

Operator

We have reached the end of the question-and-answer session, and I will now turn the call over to Jack for closing remarks.

Jack S. Abuhoff

Operator, thank you. So yes, I'll quickly recap. Today, we're announcing that we have received verbal wins from 2 of the top 5 global tech companies and that we have a strong indication of a win from a third. Of these 3 companies, 2 are new customers. These are all companies that are widely expected to forge the path forward in generative AI development over the next several years.
Last quarter, we cautioned you that these were just pipeline, and that pipeline often does not close in the quarter. We're proud to say that 2 are now verbally confirmed wins and that we got a strong indication from the third that it also is likely a win. We hope to have each of these papered in the next few weeks.
We believe that these new deals, perhaps individually, but certainly in the aggregate, present a potentially transformative opportunity for our company. As you know, we have a solid track record of land and expand with large tech companies. And now, with the additional tailwind of generative AI, we think we are extraordinarily well positioned.
We believe the revenue growth opportunity with these companies is significant from near-, medium- and long-term perspectives. We're also excited about the endorsement we believe these new wins and accomplishments represent for us. We believe virtually every company out there will need to become an AI company over the next several years, and we believe that we will be well positioned to help them do just that.
I also want to say that we plan on stepping up our Investor Relations activities in the second half of the year. We plan to be presenting at several investor conferences, and we will announce these once plans are firmed up. So again, thank you, everybody, for joining us today. We'll be looking forward to our next call with you.

Operator

This concludes today's conference, and you may disconnect your lines at this time. Thank you for your participation.
