Adobe Inc. (NASDAQ:ADBE) Q3 2023 Earnings Call Transcript


Alex Zukin: Hey, guys, thanks for taking the question. A two-parter on generative AI. First, around the pricing model for generative credits, which is obviously very progressive: how should we think about it as we go forward? You've seen, at least on the hobbyist side, some of the usage of Firefly and within Express. For the two cohorts of hobbyist and professional creatives, how many credits is a typical user or use case likely to drive in a given month? And on the question you got about Document Cloud and the impact of generative AI on that business, is it fair to think that, given the pricing that's been announced, the next wave of incremental innovation will come more from new product availability, or should we think about more pricing-enabled levers to come?

Shantanu Narayen: I think, Alex, on the first question, about how we thought about pricing: first, it's important to remember the breadth of all the segments we serve, in other words, how we think about K-12 all the way to the largest enterprises in the world. And I think it's fair to say that, philosophically, we wanted to drive more adoption. Therefore, the pricing as it relates to what's included, at least in the short run, in Firefly subscriptions, Express subscriptions, and GenStudio, in terms of how much an enterprise can get, is going to be the bulk of how we recognize the ARR. Getting that adoption and usage is where the primary focus is going to be, in terms of new user adoption as well as, for existing customers, the pricing upgrade.

So that's how we think about it. We certainly need the ability to have the generative packs, but the focus is just getting everybody exposed to it. One of the real innovations that has driven tremendous uptake is what we've called the context-aware menus within Photoshop. It's so front and center, and you'll start to see it rolled out in all of the other applications. So that's the focus: let's get the core subscriptions, let's get all of them exposed to it, and let's make sure we're covering what we need to with the pricing actions that we took. On the Document Cloud part and how we look at it, to add to what David said, one of the things people really want to know is, how can I have a conversational interface with the PDF that I have?

Not just the PDF that I have open right now, but the PDFs across all my folders, then across my entire enterprise knowledge management system, and then across the entire universe. Much like we're doing in creative, where you can start to upload your images to train your own models within an enterprise, [indiscernible]. There are a number of customers who want to talk to us, now that we've designed this to be commercially safe, and say, hey, how do we create our own model? Whether you're a Coke or a Nike, think of them as having that. I think in the document space the same interest will happen, which is: we have all our knowledge within an enterprise associated with PDFs; Adobe, help me understand how your AI can start to deliver services like that.

So, I think that's also the way you should look at the PDF opportunity that exists: more people taking advantage of the trillions of PDFs that are out there in the world and being able to do things with them. The last thing I'll mention on this front, Alex, is the APIs. Part of what we are also doing with PDFs is making all of this accessible through APIs, not just the content of the PDF but the semantic understanding of it, to do specific workflows. We're starting to enable all of that as well. So, hopefully that gives you some flavor. You're right, the generative credits have been designed more for adoption right now, but we also wanted to make sure that, at the high end, we were careful about how many generative credits we allow.

David Wadhwani: Yeah. Just one thing to add to that, Alex. First of all, it was a very thoughtful, deliberate decision to go with the generative credit model, and the limits, as you can imagine, were very, very considered in terms of how we set them. The limits are, of course, fairly low for free users; the goal there is to give them a flavor of it and then help them convert. And for paid users, especially people on our Single App and All Apps plans, one of the things we really intended to do was drive real proliferation of usage. We didn't want there to be generation anxiety, to put it that way. We wanted them to use the product. We wanted the Generative Fill and Generative Expand.

We wanted the vector creation. We wanted to build the habits of using it. And then, over time, as we introduce 3D, as we introduce video, design, and vectors, and as we introduce these Acrobat capabilities that Shantanu was talking about, the generative credits used in any given month continue to go up because they're getting more value out of it. And so that's the key thing: we want people to just start using it very actively right now and build those habits.

Alex Zukin: Super clear, super thoughtful. If I could sneak one in for Dan. Of the $520 million in net new ARR for Q4, just roughly: you've talked before about having some impact from the generative AI products you were going to launch this year. Is it fair to assume it's very minimal in that $520 million?

Dan Durn: Yeah, I would say it's a modest impact to the business in Q4. And again, a quarter from now, when we give our FY 2024 targets, we'll have more to say about what that looks like going forward, but modest impact in Q4.

Alex Zukin: Perfect. Thank you, guys.

Operator: And we’ll take a question from Brad Zelnick with Deutsche Bank.

Brad Zelnick: Great, thanks very much. David, you talked about making Firefly APIs available to customers to embed Firefly into their own content creation and workflows. Can you talk about the use cases and monetization? And is this something you foresee partners leveraging as well into their own third-party offerings?

David Wadhwani: Yes, absolutely. Our goal right now, for the enterprises and third parties we work with, is to provide a few things. The first is the ability to have API access to everything that we are building, so that they can build it into their workflows and their automation stack. The second is to give them the ability to extend or train their own models as well. As we mentioned earlier, our core foundation model is a very clean model that generates great content, and you can rely on it commercially. We want our customers and partners to be able to extend that model with content that is relevant to them, so that Firefly is able to generate content in their brand or their style. So, we'll give them the ability to train their own models as well.
