Recently, we helped one of our clients deploy their product to Microsoft’s online app store for business applications: AppSource. Specifically, our client’s product was a Dynamics 365 plugin. None of us had ever personally gone through the process of deploying a product to AppSource before, and our initial research quickly revealed that the road forward was far from paved. Sure, on the surface it sounded simple enough; Microsoft provides a quick 2-pager outlining their technical expectations for any submitted product, followed by 4 seemingly straightforward steps that would see us from start to finish. But as soon as we dug past the surface level, we encountered some uncertainty.
Our client’s product already met the majority of the technical requirements outlined in Microsoft’s documentation, but where things got murky was the requirement that it be an “easily configurable, turnkey solution.” The documentation seems to state flatly that any app requiring more than pressing “Install” wouldn’t be accepted, which was problematic for us, as our client’s product required the deployment of a number of resources external to Dynamics to function properly. However, based on a previous AppSource deployment our client had gone through, we suspected that the configuration steps their product required would be more acceptable than Microsoft’s verbiage implied.
What type of app you’re submitting informs what additional requirements are involved in the submission process. For Dynamics 365, most of these additional requirements are fairly straightforward, but the one that left us scratching our heads a bit was the Scenario and Use Case document. This document is supposed to function as an introduction to your submission for the validation team. Again, on the surface it seems fairly straightforward, but there’s little in the way of supporting documentation around what Microsoft is looking for from this document. Individual fields within the form are often defined vaguely at best, and the level of detail the validation team expects is never stated. After some digging, we ultimately pieced together definitions of the type of information Microsoft is looking for in each field. Based on example submissions we were able to track down from others in the Dynamics community, it didn’t need to be too detailed, just a high-level overview.
For those interested, I’ve attached a version of Microsoft’s Scenario and Use Case template here with definitions of what they’re looking for in each field below.
Now that we had all the technical details and supporting documentation worked out, we finally submitted everything for review. As noted in the Dynamics 365 requirements link above, Microsoft requires that your Dynamics package be included with the request. At this point, they’ll perform a code review to see if anything immediately jumps out. They emailed us back about 3 days later, stating code scans had discovered a few performance concerns, so we needed to resolve those and re-submit. Additionally, they wanted to set up a meeting to go over a few chunks of code they had some questions about.
That first meeting was relatively brief and straightforward. It was largely security-oriented; they had some concerns about the way we were doing a few things and were looking for justification for some bits of code we had in our plugin and package deployment code. By the end of the meeting, we were able to alleviate these concerns, which left us only with the need to resolve the performance issues that their code scans had called out.
After updating our code and re-submitting, we had to wait another 4 days (2 of these days were holidays) before we heard back from them again, at which point they wanted to set up a meeting with one of their validation engineers and stakeholders from our side for a more comprehensive review of the product.
In the weeds
Going into the meeting, we had very little idea of what they were going to want to discuss, but we guessed it would involve a few clarifying questions and additional details about the topics covered in the Scenario and Use Case document, and that it wouldn’t get much more involved than that. What actually resulted was a deep-dive that lasted over 2.5 hours. The Microsoft representatives asked detailed questions about nearly every facet of deployment and usage. Additionally, they had us walk through a number of chunks of code with them, explaining both their function and the rationale behind their design. By the end of it, we had provided a fairly in-depth review of most of the product’s critical components. Based on this experience, we’d suggest including team members with solid knowledge of the product and its use cases, not simply ones who understand the technical components. The Microsoft representatives will be looking to gain a comprehensive understanding of both the business and technical aspects of your submission, and it’s critical that your representation in this meeting be able to provide this. It may be a worthwhile experiment to front-load as much information as possible into the Scenario and Use Case document to see if that answers more of their questions up-front. We went pretty light on detail on that front, and that may have been a contributing factor in how involved the meeting ended up being.
Following that meeting, it was more or less smooth sailing for us. About a week after that 2nd meeting, our submission was approved, and once we had internally sorted out some marketing details, it was as simple as clicking “Publish.”
All told, it proved a mostly straightforward process aside from a couple of steps that had a fair amount of uncertainty around them. Thankfully, Microsoft’s representatives were pretty quick to give us feedback and let us know about problems with our submission. Hopefully, this overview of our experiences and pain points can prove of some use to any of you out there trying to submit your own product to AppSource.