Divide and conquer: improving contribution and evolution flow

Coordinator
Jun 25, 2013 at 10:35 PM
Lately I've been thinking a lot about the kinds of contributions we'd like to receive and how we want the community to engage with NuPattern. Some of these thoughts came up in the discussion about the new DI container for the runtime internals, but the topic deserves its own thread.

My experience showing NuPattern at the Outercurve conference was that people really see its power when they realize how much automation can already be done without any coding at all: templates, codegen, general VS automation, user interaction, etc.

Having an ever-growing and varied library of shared, community-fostered automation components is key, and we should make contributing to it as streamlined as possible. Currently, we're in a pretty dire state: users are expected to fork and clone a huge repository containing everything NuPattern, including HoLs and documentation, open a solution with 23 projects at the moment (which adds significantly to build/test times), and then wait for a full NuPattern runtime/authoring release before their contributions see the light of day.

Clearly, the current approach pushes people to author automation in their local project, alongside the toolkit, to avoid all this — even when the automation is general-purpose and would justify contributing it to NuPattern.

An entirely different tier of contributors are those who fix or improve the core runtime or authoring.

The last big refactoring by Jezz polished the solution's dependencies and areas, so it's perfectly possible (I think) to consider a more involved split.

For the Library itself, there's a key combination of services that makes for a very engaging contributor experience:
  • GitHub repository: not only Git (we have that in CodePlex), but the ability to merge directly from the website is truly fantastic and changes the dynamic between project admins and contributors. You can accept contributions even from your phone.
  • NuGet package: by offering the library as a NuGet package, taking a dependency on it (as well as shipping updates to it) is just a matter of publishing to nuget.org. Developers are increasingly adopting and expecting NuGet distribution for reusable libraries: no heavyweight, involved downloads and no user- or machine-wide VSIX installs. This also makes it possible to use different versions of the library in different toolkits as needed (i.e. no need to force everyone to update). This requires runtime support, as explained in the DI thread.
  • MyGet: combined with GitHub, you can set up this service so that it automatically builds the library NuGet package on every commit to the repository. Meaning that when a contribution is accepted (even via the cell phone ;)), the contributor only has to wait a couple of minutes to be able to use their contributed component in their toolkit. This is key. It does not mean every contribution generates a new release on nuget.org: MyGet can also serve as a "CI NuGet repo", and contributors can reference that feed in addition to nuget.org for their updates. When enough contributions justify a new public nuget.org release, one click on myget.org is enough to publish.
  • SymbolSource: sometimes these reusable components don't work as we expect. Being able to debug them is key to understanding how they work and what is and isn't supported. This even helps foster contributions, since you can spot what needs improving when something doesn't work the way (or to the extent) you expect. MyGet can automatically push symbols and sources for the built package to SymbolSource, and Visual Studio is easily configured to pull both from it, offering an integrated debugging and source-stepping experience.
This means that if done properly (I'm doing this with Clide), the cycle from a submitted pull request to a new release with symbols and source-stepping is at most 5-10 minutes. And toolkit authors can release their toolkits without waiting for NuPattern to release anything beyond the CI feed carrying the updated library package. Not even a new release to nuget.org.
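As a sketch of the toolkit author's side of this flow, here is what a NuGet.config could look like with the MyGet CI feed layered on top of nuget.org, so accepted contributions show up as package updates within minutes. The feed key and URL are hypothetical placeholders, not an actual NuPattern feed:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- nuget.org for stable, published releases -->
    <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
    <!-- hypothetical MyGet CI feed, rebuilt on every commit to the Library repo -->
    <add key="nupattern-ci" value="https://www.myget.org/F/nupattern/api/v2" />
  </packageSources>
</configuration>
```

With both sources configured, the package manager resolves the latest Library build from the CI feed without anyone having to publish to nuget.org first.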


As for the other NuPattern pieces, keeping two entirely separate repos (GitHub AND CodePlex) could add confusion. At this moment, GitHub offers more project-hosting features than CodePlex, and it could be a long time before CodePlex has anything close. The wiki and website hosting GitHub offers (with custom domains!) is unparalleled, and for the Library it's quite key: we'd almost have to auto-generate the components' documentation for potential users to browse and see what's supported. We simply can't have that on CodePlex.

The library depends on Common.* and NuPattern.Runtime.Extensibility, meaning those would probably need to ship in a NuGet package too. They cannot remain part of the Runtime solution, I think, since the runtime also depends on the library, and we'd end up with a weird kind of circular reference. Actually, we should revisit that assumption: the dependency on the library only exists because previously we could have just one library installed, and we chose to distribute it with the runtime. Some of the instantiation template wizards and maybe a few other components should move to the runtime for distribution purposes, but it may be possible to decouple it completely.

At that point, we could have the following repos/artifacts:
  • NuPattern.Runtime: containing the core implementation internals
  • NuPattern.Authoring: tooling, guidance and HoLs
  • NuPattern.Common: containing all Common.* as well as Runtime.Extensibility, and generating a single nuget package which may or may not be used by NuPattern.Runtime. It's almost certainly used by authoring (just like the library).
  • NuPattern.Library: depending on the NuPattern.Common package, and generating a single NuGet package used by authoring and by all toolkit authors, who would get the Common package transitively.
I'm not sure about the redist story with Common. It may be that we can simply rely on the NuPattern runtime to ship it, and have the library authoring experience somehow compile against the "current" version. MyGet needs to be able to pull all dependencies at build time, though, which is why I was thinking about having a NuGet package for that piece too. We need to think a bit more about that.
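To make the dependency chain above concrete, a minimal .nuspec sketch for the Library package could look like this. The version numbers and description are placeholders; the point is only that installing NuPattern.Library would transitively bring in NuPattern.Common (Common.* plus Runtime.Extensibility), which is also why MyGet must be able to resolve that package at build time:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>NuPattern.Library</id>
    <version>1.0.0</version>
    <authors>NuPattern</authors>
    <description>Shared automation library for NuPattern toolkits.</description>
    <dependencies>
      <!-- brings in Common.* and Runtime.Extensibility transitively -->
      <dependency id="NuPattern.Common" version="1.0.0" />
    </dependencies>
  </metadata>
</package>
```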
Coordinator
Jun 26, 2013 at 4:21 AM
Thinking about this more too.
I'll focus on the NuPattern/Library developers experience of this, as well as the toolkits builder experience.
Coordinator
Jun 26, 2013 at 3:30 PM
Yes, indeed.
So maybe Runtime+Authoring should be a single repo.
I'm still not convinced we can avoid the NuPattern.Common repo, since it's a dependency of both the Library and Runtime+Authoring (with the latter also depending on the Library). Kinda unavoidable I guess...
Coordinator
Jul 4, 2013 at 4:13 AM
I've spent quite some time on this refactoring. It's almost done, but I came back to a key realization: once the library can be distributed via NuGet (to avoid full NuPattern releases just to get a new command, say), this entire endeavor will depend on the new DI approach being completed first (see https://nupattern.codeplex.com/discussions/447027).

Since I'm not sure I'll be able to finish that huge undertaking, I'm now questioning the need for this one too. Maybe we should just see how the DI work goes, and leave this for later, if it's needed at all in the end?