Docs as code (2017)
Documentation as Code (Docs as Code) aligns documentation creation with coding practices, emphasizing collaboration, integration with development, and tool consistency. Write the Docs community advocates this approach through talks, successful industry examples, recommended books, and an open-source toolchain for effective implementation.
Documentation as Code (Docs as Code) advocates for writing documentation using the same tools as code, such as issue trackers, version control, plain text markup, code reviews, and automated tests. This approach fosters collaboration between writers and developers, ensuring better integration with development teams and incentivizing developers to contribute to documentation. The Write the Docs community has been actively promoting Docs as Code through talks and conferences since 2015, showcasing successful implementations in organizations like Google, Rackspace, and Amazon Web Services. The concept is gaining traction in the software industry and writing community, with presentations at conferences worldwide. Recommended books on the topic include "Docs Like Code" by Anne Gentle and "Modern Technical Writing" by Andrew Etter. The community also offers an open-source toolchain, docToolchain, to implement the Docs as Code approach effectively. This methodology aims to streamline documentation processes, empower documentarians, and enhance the overall quality of technical documentation.
Related
Documentation Driven Development (2022)
Documentation Driven Development (DDD) is proposed as a more effective approach than Test-Driven Development (TDD) for software development. DDD involves starting with documentation to iron out implementation details before coding, helping to address API shifts and scope misunderstandings early on. By documenting requirements and potential future API changes, developers can better plan and refine their code, avoiding costly refactors later. DDD emphasizes the importance of various forms of documentation, including design mockups, API references, and tests, to communicate thoughts and guide development. This method encourages a feedback loop on APIs and work scope, enhancing code quality and project outcomes. DDD aligns with concepts like Behavioral Driven Development (BDD) and Acceptance Test-Driven Development (ATDD), emphasizing user behavior validation and strong communication practices. By incorporating DDD into their workflow, developers can improve collaboration, goal refinement, and code quality, leading to more successful project outcomes.
Software Engineering Practices (2022)
Gergely Orosz sparked a Twitter discussion on software engineering practices. Simon Willison elaborated on key practices in a blog post, emphasizing documentation, test data creation, database migrations, templates, code formatting, environment setup automation, and preview environments. Willison highlights the productivity and quality benefits of investing in these practices and recommends tools like Docker, Gitpod, and Codespaces for implementation.
Git Workflows for API Technical Writers
Technical writers encounter challenges in maintaining API documentation aligned with evolving designs. Git workflows provide version control and collaboration benefits. Strategies like merging copy improvements, code annotations, and OpenAPI overlays enhance documentation quality. Involving writers in API design ensures user-centric content. Storing docs in separate repositories can resolve Git conflicts.
Readability: Google's Temple to Engineering Excellence
Google's strict readability process involves code approval by maintainers and readability mentors, shaping coding standards. Despite criticism, it enhances skills, maintains quality, and fosters global code consistency. A proposed "Readability Lite" aims for mentorship and quality without strict enforcement.
Self Documenting Code Is Bullshit
Klaus Breyer challenges self-documenting code, advocating for external documentation to enhance precision and abstraction. He emphasizes the need for detailed information like variable units and invariants, promoting a balanced approach for code clarity.
I'm talking about API documentation here - both code-level APIs (how to use these functions and classes) and HTTP/JSON/gRPC/etc. APIs that the codebase exposes to others.
If you keep the documentation in the same repo as the code you get so many benefits for free:
1. Automatic revision control. If you need to see documentation for a previous version it's right there in the repo history, visible under the release tag.
2. Documentation as part of code review: if a PR updates code but forgets to update the accompanying documentation you can catch that at review time.
3. You can run documentation unit tests - automated tests that check that the documentation at least mentions specific pieces of the code (discovered via introspection). I wrote about that a few years ago and it's been working great for me: https://simonwillison.net/2018/Jul/28/documentation-unit-tes... (see the sketch after this list).
4. Most important: your documentation can earn trust. Most documentation is out of date and everyone knows that, which means people default to not trusting documentation. If anyone who looks at the commit log can see that the documentation is being actively maintained alongside the code it documents they are far more likely to learn to trust it.
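A minimal sketch of what such a documentation unit test can look like, assuming a hypothetical package named my_project and Markdown sources under docs/ (these names are placeholders, not taken from the linked write-up):

```python
# Sketch of a documentation "unit test": fail CI if a public name discovered
# via introspection is never mentioned in the docs. The package name
# (my_project) and the docs/ layout are placeholder assumptions.
import inspect
import pathlib

import my_project  # the package whose public API should be documented


def docs_text() -> str:
    """Concatenate all Markdown files under the repo's docs/ directory."""
    docs_dir = pathlib.Path(__file__).resolve().parent.parent / "docs"
    return "\n".join(p.read_text() for p in docs_dir.glob("**/*.md"))


def public_names(module) -> list[str]:
    """Discover the public functions and classes a reader expects to be documented."""
    return [
        name
        for name, obj in inspect.getmembers(module)
        if not name.startswith("_")
        and (inspect.isfunction(obj) or inspect.isclass(obj))
    ]


def test_docs_mention_every_public_name():
    text = docs_text()
    missing = [name for name in public_names(my_project) if name not in text]
    assert not missing, f"Not mentioned anywhere in docs/: {missing}"
```

Run under pytest in CI, a check like this only proves each public name is mentioned somewhere in the docs, but that is often enough to stop new API surface from landing completely undocumented.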
The exception to this rule for me is user-facing documentation describing how end users should use the features provided by the software. I'd ideally love to keep this in the repo too, but there are rational reasons not to - it might be maintained by the customer support team who may want to work in more of a CMS environment, for example.
So they are written and adjusted as the functionality is implemented, and they can be reviewed alongside the code PRs. The CI builds the docs and makes sure there are no issues.
The format is a custom one, which is parsed and converted into JSON for language servers and into the API website. Not sure how you'd test the docs content, but this parser is tested for sure.
Works great for us in general.
While it is possible to do okay anyway, it only happens if there is effort over time.
I'm convinced that when someone asks you a question about your APIs, the best thing to do is go write the answer that person needs (now that you are not so close to the code, you can do this better) and have them review it until they understand. You are not allowed to talk to that person except via new documentation, while they can pester you as much as they want until you make the documentation usable. It will still take some rounds, but if nobody is reading the documentation there is no point in writing it either.
To do bug fixes, one simply updates the docs to explain the new behavior and intentions, and perhaps include an example (ie. unit test) or a property. This is then reflected in the new version of the codebase - the codebase as a whole, not simply one function or module. So the global refactoring or rewrites happen automatically, simply from conditioning on the new docs as a whole.
This might sound breathtakingly inefficient and expensive, but it's just the next step in the long progression from the raw machine ops to assembler to low-level languages like C or LLVM to high-level languages to docs/specifications... I'm sure at each step, the masters of the lower stage were horrified by the profligacy and waste of just throwing away the lower stage each time and redoing everything from scratch.
There is value in the larger organization being able to consume documentation and commenting on it and contributing to it.
There is conceptual value in some of these things, but I find it to be overstated and the downsides entirely ignored.
Most documentation systems have a version history.
And most documentation systems are far easier adopted by people other than engineers.
This is the equivalent of pointing out that Figma has x, y, and z benefits and designers are fluent in it, so we should be using that for documentation.
How do you write automated tests for documentation? Somehow require that blocks of code have documentation linked to them?
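One common answer (my own illustration, not something this thread prescribes) is to make the examples embedded in the documentation executable, so the docs fail CI when they drift from the code. In Python that can be as simple as running doctest over the ">>>" examples in docstrings:

```python
# Sketch: the usage example in the docstring doubles as an automated test.
# doctest re-runs the ">>>" lines and fails if the printed output changes.
def slugify(title: str) -> str:
    """Turn a page title into a URL slug.

    >>> slugify("Docs as Code")
    'docs-as-code'
    """
    return "-".join(title.lower().split())


if __name__ == "__main__":
    import doctest

    # Non-zero exit code on any failing example, so this can run as a CI step.
    raise SystemExit(doctest.testmod().failed)
```

The check in the other direction - requiring that public code objects have documentation attached at all - is a similar introspection walk that just asserts each object's __doc__ is not empty.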
For example, in code, you can generally use feature flags, A/B testing, etc. to show different things to different people quite flexibly, but (depending on how the documentation is actually published) you might have very different capabilities.
But that's not really treating your docs as code, more like "storing your docs in the same place as your code." A system like Sphinx with autosummary and autodoc where the docs are generated from your code and human-readable details like examples are pulled from the relevant docstrings is very much docs as code. Same with FastAPI's automatic OpenAPI generation and automatic Swagger. Pulling the examples section for your functions directly from your tests, now that's docs as code.
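For the FastAPI case the comment mentions, here is a minimal sketch of how the reference docs are generated straight from the code (the endpoint and model names are made up for illustration):

```python
# Minimal sketch of the FastAPI pattern: the type hints, response model, and
# docstring below are what the generated OpenAPI/Swagger page is built from,
# so the API reference lives in the code itself. Names are illustrative only.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Widget API")


class Widget(BaseModel):
    id: int
    name: str


@app.get("/widgets/{widget_id}", response_model=Widget, summary="Fetch one widget")
def read_widget(widget_id: int) -> Widget:
    """Return the widget with the given id.

    This docstring becomes the endpoint description in the generated
    /docs (Swagger UI) page and the /openapi.json spec.
    """
    return Widget(id=widget_id, name="example")
```

Serving this with uvicorn exposes the interactive docs at /docs and the raw spec at /openapi.json, regenerated on every code change rather than maintained by hand.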
We are close, but it's not there yet. I will always need to run a validation test against the code and eyeball it to ensure it's not insane. But today, it's clear to me that if ChatGPT/Copilot doesn't generate correct code quickly and easily from what I wrote, I didn't understand the problem and couldn't express it clearly.
I agree with having a shared doc repo so anyone can commit changes/patches to docs, which is, well... not much different from what most wikis offer already, and while useful, wikis prove that's not enough to have good docs and little to no garbage in them...
But I will NEVER "host my docs" on someone else's platform, depending on their services (if you host code/docs as a mere repo, GitHub and the like are just a mirror of something most devs already have; if you use their features, your workflow ends up depending heavily on them), and I also never use MD as my default choice.
Version control is the best for documentation.
But maintaining it is hard - lots of great comments here.
For anyone interested, I'm working on https://hyperlint.com/ (disclaimer: bootstrapped founder) to help automate the toil around documentation.
When the same document is edited by two separate individuals and diverges, it is a nightmare to reconcile the two.
I truly wish (i.) Microsoft Word was a nicer format for VCS, or (ii.) Markdown was more suitable for “formal” legal texts and specifications — probably in that order (!)
For this to work well (not just like an API reference), the implementation itself had to be structured well.
I'm going to head all this off at the pass, and say instead that DaC[1] is a technological tool for a limited number of business use cases. It's not a panacea, no more than XML publishing in a CCMS (component content management system) was seen as the Alpha and the Omega (and indeed still is by a whoooooole lot of people). I say this as a heartfelt believer in the DaC approach vs a big heavy XML approach.
Your first question - really, this should always be your first question - is, "how do people do their jobs today?" If you work in a broom factory, and the CAD guy reads Word documents, the pubs guys use FrameMaker, the reviews are in PDF, and the final delivery is a handful of PDF documents... well, using DaC is going to be a jump.
Now, is that jump worth it? Well, it might be. Your CAD guy might know his way around GitLens, your pubs folks probably have some experience with more complex publishing build systems, and, most important of all, you might have a change tempo that really recommends the faster-moving flows of DaC. If you're going the AsciiDoc route, you could even try out some re-use via the `include` and `conditional` directives. But it could also be a disaster, with no one using VCS, no one planning out re-use properly[2], people passing reviews around in whatever format, and PDF builds hand-tooled each time. It's not something you dive into because it's what the cool kids are doing. Some places, maybe even most places legacy-industry-wise, it's just not going to work. Your task - if your job is consulting about such things - is to be able to read the room real fast, and recognize where it's a good fit and where you might need to back off and point to a heavier solution.
[0] Big traditional XML publishing systems are also in the crosshairs, as they're quite frankly usuriously expensive, and writer teams have started noticing the annoying tendency of vendors to sell a big CCMS and then - once the content's migrated - completely disappear, knowing that the costs of migration will keep you paying the bill basically forever.
[1] DaC defined as : lightweight markup (adoc, md, rst, etc), written/reviewed with a general-purpose text editor, where change/review/publish is handled on generic version control (git, hg, svn, etc), and the consumable "documents" are produced as part of a build system.
[2] Which crashes ANY CCMS, regardless of how expensive or how DaC-y it is.
I thought it was gonna be all about ensuring your api documentation is closely coupled with your code. But it's more about using code tools to write docs.
I'm kinda two ways on it; doesn't it depend on what "docs" actually are? (I couldn't find a definition on the page.) Wikipedia is a kind of documentation, but tying it to version control tools would massively restrict the number of people contributing and therefore the quality of the docs.
I dunno, maybe I'm missing the point.
First, code is formal language, and docs are natural language. That's a lot of jargon; what does it mean? It means that the chunks inside of a piece of code are consistently significant; a method is a method, a function is a function. Chunks in a document are, woo boy, good luck with that one. XML doesn't even have line breaks normalized. Again, no matter what the XML priesthood natters about, it's natural language.
A consequence of this is that the units of change are much, much smaller in a repo of code vs a corpus of documents. This, well, can be OK, but it also means that a PR in a docs-as-code arrangement can be frickin' terrifying. What this means is that you have to have a pretty good handle on controlling the scope of change. Don't branch based on doc revisions, but rather on much more incremental change, like an engineering change order or a ticket number.
Your third problem is that the review format will never - can never - be completely equivalent to the deliverable. The build process will always stand in the way, because doing a full doc build for every read is too much overhead for basically any previewer or system on the planet. This is a hard stop for a lot of DaC adopters, as many crusty managers insist that the review format has to be IDENTICAL to the format as it's delivered. Of course, that means when you use things like CIRs (common information repositories) that you end up reviewing hundreds of thousands of books because an acronym changed....but I call 'em "crusty" for a reason. They're idiots.
I think the amount of effort you should put into documentation varies wildly on the scope of the project.