
Comments (12)

chanmosq commented on August 29, 2024

I've read and support this RFC proposal.

It's really important for docs to have a feature grid that can accurately portray the docs' health status across O3DE's various domains - I think this is a great idea. Additionally, it brings clarity to the overall state of O3DE Docs and can help identify any gaps.

Upon acceptance, work for this RFC should be supplemented with defining the levels of docs "quality". That would help sigs make a more informed judgement of the state of their feature docs.

from sig-docs-community.

chanmosq commented on August 29, 2024

Requesting feedback from @o3de/sig-release, and suggest bringing this up to TSC for awareness. Since this depends on other sigs to report on the docs health status of their own features, and is involved with the release process, we would like your review on this.

The last day for feedback is Friday, Oct. 28.


vincent6767 commented on August 29, 2024

I've read and support this proposal. I suggest asking individual SIGs for feedback, since this adds another responsibility and a coordination effort with the SIG Docs community.


chanmosq commented on August 29, 2024

In response to @vincent6767, we will reach out to other sigs and extend the date for processing this RFC.


willihay commented on August 29, 2024

I've read and support this proposal. Getting sig involvement in the evaluation of documentation relevant to their feature areas seems like a smart idea.

A couple questions on the definitions of the columns:

  1. For tutorials you include "samples" in the description:

    Tutorials: Examples, samples, and walkthroughs designed to help users with the specified feature.

    "Samples" means something very specific, and has its own column. I'd remove that from the description.

  2. Do you include as part of the tutorials category the how-to topics that are often embedded in the feature documentation? I am in agreement with this, because they both involve practical steps, but it's not 100% clear to me if this is the intention, and since this category is named "tutorials", the how-to topics might be overlooked when sigs evaluate the docs. You might want to explicitly state that tutorials can include any procedural topics in any of the various guides, not just the Tutorials ("learning guide") section. You could call this category "Procedural docs".


amzn-rhhong commented on August 29, 2024

Do we have a score/feedback system on each doc page? Something like "is this article helpful?" Or "on a scale of 1-5, what do you think of this article?"

I'm thinking pulling data like that and using it to generate the health matrix for docs is going to really reflect how customers think.

Could be as simple as this: [image]
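As a rough sketch of how per-page ratings like this could roll up into a docs health matrix (the page names, thresholds, and roll-up rule below are all illustrative assumptions, not anything defined by the RFC):

```python
from statistics import mean

# Hypothetical per-page 1-5 ratings collected from a feedback widget.
# Page names and scores are invented for illustration.
RATINGS = {
    "atom-renderer/overview": [5, 4, 4, 3],
    "physics/getting-started": [2, 1, 3],
}

def health_status(scores):
    """Map an average rating to a coarse grid color (thresholds assumed)."""
    avg = mean(scores)
    if avg >= 4.0:
        return "GREEN"
    if avg >= 2.5:
        return "YELLOW"
    return "RED"

# Roll the ratings up into a health matrix keyed by page.
matrix = {page: health_status(scores) for page, scores in RATINGS.items()}
for page, status in sorted(matrix.items()):
    print(f"{page}: {status}")
```

The thresholds would of course need tuning against real data before anything like this drove the grid.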


chanmosq commented on August 29, 2024

@amzn-rhhong Having some sort of customer assessment of docs quality is a good idea and something the sig should investigate. Perhaps this is a conversation for outside this RFC, though, because the feature grid is intended for sigs to report on the status of docs. And at this time, the "quality" that we report on will be based on an internal assessment of "Do we have some learning content (docs, videos, or samples) such that a user can use the feature?"


FiniteStateGit commented on August 29, 2024
1. For tutorials you include "samples" in the description:
   > Tutorials: Examples, samples, and walkthroughs designed to help users with the specified feature.
   
   "Samples" means something very specific, and has its own column. I'd remove that from the description.

Agreed. "Samples" in the definition of tutorials could be substituted with "how-tos": topics that demonstrate a procedural workflow.

2. Do you include as part of the tutorials category the how-to topics that are often embedded in the feature documentation? I am in agreement with this, because they both involve practical steps, but it's not 100% clear to me if this is the intention, and since this category is named "tutorials", the how-to topics might be overlooked when sigs evaluate the docs. You might want to explicitly state that tutorials can include any procedural topics in any of the various guides, not just the [Tutorials](https://www.o3de.org/docs/learning-guide/) ("learning guide") section. You could call this category "Procedural docs".

It is intended that both how-to topics normally found embedded in the user-guide and the dedicated tutorials found in the tutorial section of the site are counted towards a feature's documentation.


lmbr-pip commented on August 29, 2024

I do agree with the sentiment, though I would like to understand how doc users find and make use of this information. Is there linkage to it from o3de.org? It's not really covered how folks looking for documentation are expected to work with this information. Is it just for potential docs contributors?

As for the proposal, my main problem is that the ratings still seem highly subjective and require a lot of cognitive effort from the SIG chair(s) to set up.

Secondly, did you consider having a "state of docs" field, i.e. "Missing - Not Planned", "Planned", "In Active Development", or "Delivered", to convey the state of a feature's docs?
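If a field like that were adopted, it could be modeled as a small enum. A minimal sketch in Python, where the state names follow the suggestion above and the feature entries are invented for illustration:

```python
from enum import Enum

# Hypothetical "state of docs" field; state names follow the suggestion
# above, the feature entries below are made up.
class DocsState(Enum):
    MISSING_NOT_PLANNED = "Missing - Not Planned"
    PLANNED = "Planned"
    IN_ACTIVE_DEVELOPMENT = "In Active Development"
    DELIVERED = "Delivered"

feature_docs = {
    "Terrain": DocsState.IN_ACTIVE_DEVELOPMENT,
    "Script Canvas": DocsState.DELIVERED,
}

for feature, state in feature_docs.items():
    print(f"{feature}: {state.value}")
```

An explicit state machine like this would also make it easy to validate grid entries in CI rather than relying on free-text status strings.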

I'm especially interested in how we relate docs and features to the "dev" documentation. I may mark a feature as having rich/wonderful/complete docs, but they may only be in the dev docs branch and thus invisible to most users.


sptramer commented on August 29, 2024

> Do we have a score/feedback system on each doc page? Something like "is this article helpful?" Or "on a scale of 1-5, what do you think of this article?"
>
> I'm thinking pulling data like that and using it to generate the health matrix for docs is going to really reflect how customers think.
>
> Could be as simple as this: [image]

👍/👎 rankings are outside the scope of the feature grid discussion at this time. That would require a separate RFC; engaging in this kind of user study requires cooperation from legal departments.


sptramer commented on August 29, 2024

@lmbr-pip full comment

Hi pip, I accidentally edited your comment instead of writing my own. A misclick led to me overwriting parts of your comment, but I've left the parts I kept in place and moved my replies below.


> I do agree with the sentiment, though I would like to understand how doc users find and make use of this information. Is there linkage to it from o3de.org? It's not really covered how folks looking for documentation are expected to work with this information. Is it just for potential docs contributors?

It would be part of the feature grid release notes (https://www.o3de.org/docs/release-notes/22-10-0/feature-state/). As for how they "make use" of this information - it's unclear. There are many ways this can go wrong with under- or over-evaluation (especially RED/GREEN), even if we were taking a data-driven approach to determining "quality" or "usefulness". We may need a more objective measurement system like NPP (negative sentiment evaluation).

> As for the proposal, my main problem is that the ratings still seem highly subjective and require a lot of cognitive effort from the SIG chair(s) to set up.

Subjective evaluations would be performed by docs + the sig, however they choose to do so. We understand that we will be consistently YELLOW or RED under this reporting rubric. It's about the "sufficiency" of documentation for the product: Can a user onboard? Can they understand it? Is there a full reference available?

Engineering teams who take dedicated time to focus on docs would be able to establish a baseline of quality ("is it better / worse than last time? Did we add stuff? Are things still missing? Do users report issues?"). We may need to hold this RFC until there's a way to perform better evaluations.
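The sufficiency questions above could be rolled up mechanically into a grid color. A hypothetical sketch (the roll-up rule here is an assumption on my part, not something the RFC defines):

```python
# Hypothetical roll-up of the three sufficiency questions into a grid
# color. The rule (all yes = GREEN, onboarding only = YELLOW) is assumed.
def grid_color(can_onboard: bool, can_understand: bool, full_reference: bool) -> str:
    answers = [can_onboard, can_understand, full_reference]
    if all(answers):
        return "GREEN"
    if can_onboard:  # some usable docs exist, but gaps remain
        return "YELLOW"
    return "RED"

print(grid_color(True, True, True))    # GREEN
print(grid_color(True, True, False))   # YELLOW
print(grid_color(False, False, False)) # RED
```

Even a crude rule like this would at least make the color assignments reproducible between releases, which addresses part of the subjectivity concern.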

> Secondly, did you consider having a "state of docs" field, i.e. "Missing - Not Planned", "Planned", "In Active Development", or "Delivered", to convey the state of a feature's docs?

This would be the intent of a roadmap, not a feature grid. Feature grids are supposed to be for snapshots in time.

> I'm especially interested in how we relate docs and features to the "dev" documentation. I may mark a feature as having rich/wonderful/complete docs, but they may only be in the dev docs branch and thus invisible to most users.

Do you report on the dev feature in the feature grid? If so, docs for the feature are also reported in the feature grid, on the same line item.


chanmosq commented on August 29, 2024

At the sig-docs meeting on 11/08/2022, we unanimously and provisionally accepted this RFC, with the understanding that we need to establish a system of metrics for evaluating docs quality and address the other concerns raised in this issue.

