
acre's People

Contributors

abogdanoski, albertchae, chuyishang, em-ng21, fhoces, joelferg, khoeberling, mweiss, myzhang01, petezh, sungsy12345, taranganathapa


acre's Issues

Improvements chapter updates

  • Connect with Tips Chapter for ideas for improvements (sentence + link)
  • Paper-level improvements - Add sentence about Future improvements
  • Display item improvements – Make the use of links and the language re: Comms guidance consistent throughout
  • Display item improvements – Make language re: checking previous reproductions consistent (lines 26, 36, 44). Where will the database be housed? Add sentence about what to do if someone has contacted the authors previously. [I don't think there is yet a straightforward way for users to "Use the SSRP to verify whether there previous reproducers have contacted the authors regarding this paper and the specific missing files." I suggest removing this for now, or saying something like "Eventually you will be able to check..., but for now, skip to step #."]
  • Display item improvements – Add intro sentence like the one under the paper-level improvements
  • Display item improvements – Line 22: I think we’re no longer using the spreadsheet. If not, remove/revise to mention the Assessment stage form
  • Display item improvements – Line 28: Does “costs” refer to time, $, something else? Should we be more specific about this? Students might not be aware of these.
  • Display item improvements – Line 50: Are steps 1-5 missing? Or is this referring to previous steps? Probably better to copy/paste if so
  • Display item improvements – Line 59: Refers to “ACRE procedure” – what is this? Add link? @fhoces
  • Display item improvements – Line 64: Copy/paste revised steps from Debugging Analysis Code section
  • Display item improvements – Line 77: Reporting Results – remove/revise reference to the spreadsheet

Abel B's feedback to Guide

Seems to switch focus back and forth between papers and display items. Be clearer about unit of analysis at each stage

  • Clarify that assessment is about computational reproducibility of display items; not claims or robustness checks
  • Maybe replace letters with numbers or levels?
  • Clarify that reproducibility scores are only used in Assessment and Improvements stages. We don't ask reproducers to assign scores to robustness checks.
  • Fix numbering inconsistencies between stages and chapter numbers
  • Add more tips with recommendations on best course of action for reproducer

Cover page updates

  • Replace ACRE -> SSRP
  • Update figure with stage 0 (select a paper) + match sub-stages/steps with the form
  • Finalize Guide title (Katie used “Guide for Advancing Computational Reproducibility in the Social Sciences” in the Contributions chapter)

Tips & Resources chapter updates

  • Add a bit of intro on where in the exercise these tips are relevant, e.g. “Reproducible workflows are based on three tools and practices: standardized folder structure, literate programming, and version control. Here is a brief overview of what each of these entails, based on a chapter from Christensen et al., 2019.”
  • 8.2: Same as above, this needs a bit of introduction and explanation of how these links can be used as part of SSRP.
  • 8.2.1: We should add these resources to the BITSS Resource Library and just link to the library from here (many of them are there already).

Core to do's

  • Update Qualtrics surveys to reflect updated guidelines

  • Review AEA material in detail and incorporate elements from it

  • Create a one-pager

  • Incorporate template language

  • Update robustness section

  • Write up a section on concluding the exercise

Communications guidance

Introduction

  • Add a sentence on checking SSRP for previous contact/not over-burdening authors. Maybe also add a sentence on how instructors/advisors can supervise or moderate this process.
  • Add a sentence on how long this process might take, and when reproducers should plan to start
  • 1.c. (Line 0): Provide more guidance on what “how far” means (“...how “far” your results must deviate from the original work before claiming that the study could not be reproduced.”). Or rephrase away from “could not be reproduced” toward referencing the scale and specific missing items? @fhoces Which did you mean here?

For Reproducers

  • 1. (Line 31): Add information for other social science disciplines (might just be AJPS). Doesn't seem relevant anymore (@abogdanoski did you edit this section?)
  • 5. (Line 46): Clarify where to record interactions on the platform
  • “Contacting the original author to share the results of your reproduction exercise” template:
  • Line 207: Remove “extensions” as step iv?
    Lines 208-214: The backticks (``) didn’t render as intended. Not sure what the right syntax is here.

For Authors

  • Add information for other social science disciplines (might just be AJPS).
  • Remove the “>” formatting/typing glitch throughout the template letters.
  • Depending on the role of instructors per the anonymity solution, we may need to add a note about how to respond to instructors who reach out on behalf of a class. I’m not sure exactly what the specifics of talking to instructors vs. reproducers directly are. @fhoces I think we can leave this for now since I don't think Ted will do this, but we should ask instructors interested in the next phase how they want to do this.

Harassment/discrimination

  • Add language for other disciplines
  • Line 321: Can a non-economist join the AEA to file a complaint?
  • Line 333: I added a sentence in the mental health section to clarify that bullying, discrimination, and harassment are not the victim’s fault (original sentence could be triggering)

Scoping chapter updates

  • Migrate 1.1 to a separate section: Stage 0: Selecting a paper @fhoces
  • Add a definition of claim in this section
  • Verify whether the order and content of sub-headings/steps match the survey on the platform
  • 1.1, at step 1, we instruct reproducers to check for previous entries:
  • In which section of the platform is this available? @albertchae please let us know when some solution is available.
  • Is there any way a reproducer can see the reproduction package from a private reproduction?
  • In 1.2.1, incorporate into the workflow: read the paper and consider these questions (include under 1.2 Scoping your declared paper)
  • Number “declare specific estimates” + “declare possible robustness checks”
  • Record the scope of the exercise: verify whether these flow logically with the rest of the chapter @fhoces
  • 1.3 should probably be 1.2.3 -- can you confirm that makes sense? @fhoces
  • 1.4 should be moved to the Introduction and placed right after “Reproduction Strategies”
  • Add guidance on how to declare robustness checks (narrative form, formula, etc.) and implications for later stages of the exercise (especially if the Scoping section is locked after completing it) @fhoces, please draft

General language updates throughout the platform

Replace references throughout:

  • “ACRE platform/database” -> Social Science Reproduction Platform;
  • “exercise” -> reproduction and/or reproduction attempt
  • Guidelines –> Guide
  • Economics –> Social Sciences (as applicable)
  • Remove references to Surveys 1 and 2
  • Replace “output” with the relevant unit of analysis, e.g. display item, claim, etc. (code is relevant for products of scripts)

Assessment chapter updates

  • Introduction: Replace “output” with “display item”
  • Remove Survey 2 references
  • 2.1: Add definitions for raw vs. analytic data in addition to cleaning and analysis code
  • 2.1.1: Delete reference to the “standardized spreadsheet” and required structure; explain that this is part of the survey and each line is a data source @fhoces
  • 2.1.2: same as 2.1.1 @fhoces
  • 2.1.3: same as 2.1.1 @fhoces
  • 2.2: replace output-to-be-reproduced with relevant unit of analysis @fhoces
  • 2.2.1: replace “ACRE Diagram Builder” -> diagram builder @fhoces
  • 2.2.2: verify that the workflow for incomplete diagrams is still the same, especially when the diagram needs to be edited manually (the survey isn’t super clear here and it just points to this section) @fhoces
  • 2.2.2: Remove reference to the spreadsheets + Make a reference to examples in Ch. 7
  • 2.3: replace “output” with “display item”
  • 2.3.1: remove reference to Survey 1 (this step is automatically incorporated into the survey workflow now in Step 0)
  • Table 3.1: Raw data information in the Guide doesn't have the same columns as the platform. @fhoces please update it to match.
  • Add instructions on what it means to check the "Provided/Cited" columns. @fhoces please review whether my draft is correct.
  • @fhoces please draft a section (place it before Assess display items) corresponding to the 3.Master script step on the platform.
  • @fhoces Add to 3.1.3.1 Common mistakes “If tree shows multiple files as one input, make sure to separate them with a semicolon (;).” or something to this effect

To do for Module

Book

  • Replace "output" with "display item" throughout; clarify the distinction between display item, claim, and estimate @fhoces
  • “Closed for reproductions”: clarify what happens if the author later on releases the data? @fhoces
  • Number the stages to imply chronological order of each
  • Reproducibility scale: restructure scores around three reference points: L1=no data, no code -- L5 CRR -- L10 CRA
  • Reproducibility scale: translate levels for admin/restricted-access data @fhoces
  • Set up a survey/voting system for reproducibility level, but highlight that our value judgment is that data is more important than code for reproducibility
  • Write an email to authors requesting reproduction material (line 48) @abogdanoski
  • Capitalize ACRE Diagram Builder throughout
  • Spell out abbreviations and acronyms throughout, particularly decision tree in 1.3 and the proprietary and confidential data levels
  • clarify that last tree diagram is done manually @fhoces
  • verify survey + spreadsheet links throughout the guidelines @joelferg
  • Write template language for authors to respond to requests (line 253) @abogdanoski
  • convert tables into HTML + test that formatting works across browsers @joelferg

Survey 1

  • add a question about email ("we will use this to send your Survey 2").
  • add a question about contacting authors with simple text entry
  • Add a question to prespecify robustness checks and extensions, if applicable
  • Add “other” as an option for the context of reproduction exercise in Survey 1.
  • Add other in Q7
  • Increase the number of claims in Q8. Add an example of a claim.
  • Add “doing all claims in the paper” in Q9.
  • In Q10 make sure units of measure are the same.
  • Switch Q13 and Q14, and note that the general population may be the same as the one to which estimates apply.
  • Q18 use “preferred analytical specification”
  • Q19 change to “choose up to 5 specifications.”
  • Q17 and Q18 put together in the same table/page. Add “other” field. Include appendix, supplementary materials, etc.
  • Add info about which specification is the preferred one. Or pick one as the reproducer.
  • Move Q19 to the beginning of the survey.
  • Add a question at the beginning on “To what extent are you familiar with the paper? Have you evaluated it before?”
  • Q5.2 Add level 4
  • Q5.4 add language that the number of claims requires some subjective assessment
  • Q6.8 add a 4th row to enter a statistic other than S.E.
  • add back buttons throughout @joelferg
  • Q5.6 add an option to say more than 6 but not all claims
  • Q6.1 make sure that # is filled out with the actual number of the claim
  • set up a survey completion report at the end of the survey + instructions on how to access Survey 2 @joelferg
  • set up a confirmation email with completion report at the end of the survey + instructions on how to access Survey 2 @joelferg

Survey 2:
From 4.10 meeting:

  • Add a question about the reproducer's name at the beginning.
  • Q8.3 -- Add "other" as a possible improvement (e.g. an entire file may contain the input for a display item). @fhoces @joelferg
  • Q8.4 -- Remove semicolon input requirement
  • Q8.5 -- Wording is a bit confusing. Change to “Relative to the original repro package, have you been able to include raw or proprietary data?” Add an explanation for context (e.g. many times there’s restricted access that authors can’t share). Possible answers: 1) Data is restricted access; 2) Data is now public; 3) Data is restricted access, but I was able to reproduce it privately. @fhoces
  • Q8.7 -- Add: “Given the improvements that you have added …”
  • Robustness introduction -- Reword to emphasize it's about meaningful/important analytical choices (provide an example). This is a subjective judgment, and you can choose to focus on just the important stuff, but can also analyze all. @fhoces
  • Q9.1 -- Add a note: Your time spent will not factor into your grade.
  • Comments in final section -- Add comments about own reproduction package + feedback for overall ACRE exercise and materials.

--

From 4.8 meeting:

  • Introduction: change due time to 2 pm
  • Introduction: add "You may not be able to navigate back from certain loops. This is a programming issue that we hope to resolve in future iterations of the exercise"
  • Q.2.1-3 add/fix link to assessment spreadsheet
  • make sure all links to assessment spreadsheets open up in a new tab/window (otherwise, navigating back removes all inputted data) @fhoces @joelferg
  • replace "output" with "display item" throughout the survey
  • replace "best practices" with "reproducibility tools and practices" throughout Survey 2 and the Guidelines
  • Q2.8 add explanation of what constitutes a "display item"
  • Q2.8 make it possible to enter more than 10 outputs/display items @fhoces
  • Q3.3 change to “For tables and figures, you would use the title of the table or figure”.
  • Q3.7 clarify that tree diagram is for students' reference and is not required to be submitted
  • Q5.1 reword answers @abogdanoski
  • Q5.4 combine first two answers into one
  • Q5.5 add a third answer for time spent doing other minor corrections
  • Q5.7 explain subjectivity of assessment
  • Q6.1 Reword message from economics at large to specific paper
  • Q3.7 explain how students can build the tree if they need to do it themselves @fhoces
  • Update guidelines to reflect the change in terminology of "display item"
  • Q7.2 Clarify that this is about improvements made by the reproducer + give examples of improvements possible

--

  • integrate mapping spreadsheet into Survey 2 @joelferg
  • Change "minimal effort" definition to one hour or less
  • Make it a bit more obvious when assigning the score that it is output-level, rather than paper-level
  • Q14.7: Spell out acronyms from the reproducibility scale
  • Q13.5: Make it possible to select multiple answers, correct grammar (remove "did")
  • Q14.1: Change "table 1" to "output 1"
  • Q14.6 Change answers to more elaborate statements, e.g. "Yes, exactly the same;" "No, but quite similar" + add a short text box to describe differences
  • draft brief instructions on how to complete mapping part of the exercise using the new tool @joelferg

Survey 3:

  • on hold for now, check with Ted if robustness checks will be a part of the exercise

General:

  • confirm with Ted that papers will be pre-selected
  • Survey 1 is timestamped when submitted and non-modifiable after. Survey 2 should be possible to go back to until you’ve submitted it.
  • re-order Q numbering in surveys once done with editing
  • draft emails to distribute the surveys
  • confirm point person for questions from students during class
  • make Robustness optional and figure out a way to record students' work for it @fhoces
  • create a glossary of terms (e.g. computational reproducibility, raw + analytic data, data source + data file, cleaning + analysis code, reproducer, original author, etc.).

Timeline: Survey 1 to be posted 3/30, due 4/10; Survey 2 posted 4/13 (or earlier if ready) due 4/24

Post-pilot revisions and open tasks for ACRE Guidelines and Surveys

Timeline and responsibilities: https://docs.google.com/spreadsheets/d/1vr0DAO78W_oe-eAB7hOHzrIU83YgfdB8wlaRPKVyJcc/edit?usp=sharing.

Open items

Guidelines

Content

  • In 1.1.1 outline in a bit more detail the workflow from candidate to declared paper @fhoces

  • At the end of section 1.2, write a reminder to set up a revised reproduction package @fhoces

  • Provide more guidance on how to add and scope claims in Scoping chapter @fhoces

  • Template email for communications in unused data scenario @abogdanoski

  • Draft chapter on "Concluding the Reproduction" w/ description of the desired outputs + guidance on how to structure reproduction package + info on how we will use the results of the reproduction. This would probably be an added chapter to be inserted between Ch. 6 and 7 in current draft. @fhoces

  • Chapter 6: Draft introduction explaining diagram examples @fhoces

  • @fhoces Add figures to Robustness chapter and notify @joelferg

  • Ch. 7.2: Resources (@fhoces to draft a table with i) general resources for computational reproducibility, ii) exemplary projects, iii) exemplary authors)

  • Fernando to review updated AEA template, then modify for ACRE purposes @fhoces @joelferg

  • Ch. 8: Tips for reproducibility (draft complete, @khoeberling reviewed; comments in pull request)

  • Ch. 9: Update licensing, including suggested citations format (@khoeberling)

  • Review references and integrate BibTeX @abogdanoski

  • Ch. 10: Glossary of terms (draft complete, review pending @fhoces)

  • Platform Code of Conduct @abogdanoski @khoeberling

Surveys

Survey 1
Content

  • Add questions about pre-specifying robustness checks @fhoces
  • Q6.8: add a choice to report p-values @abogdanoski
  • Q6.8: make clear that they need to input numbers @fhoces
  • Q6.8 force formats in fields @em-ng21 (?)

Survey 2

  • Clarify order of reproducibility assessment questions: first about paper-level reproducibility assessment, then about each individual display item @fhoces
  • Clarify connection between Survey 1 (claims) and Survey 2 (display items) @fhoces
  • Develop grading rubrics, consider ways to automate student reports @fhoces @em-ng21
  • Fix skip pattern with "I am not sure" option for scoring levels [clarify what this means] @fhoces
  • Add "data format" and "specific software" to improvement options @abogdanoski
  • Add a question on whether differences in a display item relate to pre-specified claims (or correspond to a different section of the DI that does not fundamentally connect to or relate to the claim) @fhoces [reword or clarify what this means]
  • Add questions where reproducers must score some parts of other reproducers' work (summary of claims, repro package, etc.). @joelferg (?)
  • Q7.2 Clarify that this is about improvements made by the reproducer + give examples of improvements possible (?)

Spreadsheet

  • Make separate, view only example workbook and move all example sheets there @joelferg
  • Populate example workbook with better/more concrete/more examples @joelferg @fhoces
  • Confirm all links function correctly @joelferg
  • Remove hyperlinks in .do files/make it so specific file types don't automatically become hyperlinks @joelferg
  • Include something about apostrophes before +'s in improvement sheet (?)

Completed items:

Guidelines

  • Add Stage 0, i.e. guidance on selecting a paper and communicating with authors to ask for missing reproduction package @fhoces
  • Chapter 8: Additional resources @joelferg @em-ng21
  • Template email for communications at Stage 0 @abogdanoski
  • Chapter 4: Robustness @fhoces
  • Build examples of "incomplete workflows" and feed them to the diagram builder @joelferg and @fhoces to workshop a solution
  • Chapter 9: Contributions @em-ng21 @khoeberling
  • In Ch. 2.1: explain how to distinguish between different data and code files (?)
  • Develop examples for reproduction strategies @joelferg @fhoces

Formatting and Aesthetics

  • Convert tables and figures into HTML; check operability across browsers @em-ng21
  • Create a cover @em-ng21
  • Tips and resources for reproducibility chapter draft @em-ng21

Survey 1

Content

  • Create an automated report @fhoces @em-ng21

Formatting

  • set up navigation back and forth

Survey 2

Content

  • In the Introduction, clearly state the desired outputs of the overall exercise and especially an updated reproduction package @fhoces

Diagram Builder @joelferg

  • Automate diagram building
  • Update the survey and guidelines with an example using "unused data sources"
  • Fix bug with a comma separating files: option 1: fix it in code; option 2: add instructions to materials (i.e. survey and guidelines) to avoid this problem (e.g. R_8wR08MAw9yCyQdH_ACRE_diagram.txt)
  • Add a separation between unused data sources and unused data files.
  • Figure out why some diagrams don't have a name for output (eg R_bvIfoWKuIvhj877_diagram.txt)
  • Ask for more flexibility in question 10.1.
  • Ask about more annotations.
  • Ask to add a vertical line (low priority)

General

Immediate

  • Create a script to automate the list of contributors @em-ng21 (see the sketch after this list)
  • Allow unrestricted navigation back through both surveys (consider implementing breadcrumbs navigation system) @fhoces
  • Create a template of reproduction package including recording sheets and diagram trees @fhoces @joelferg
  • Language review/ copy editing @abogdanoski @khoeberling
  • 1 slide for WEAI presentation with a timeline for key milestones @abogdanoski
  • Match structure and numbering of questions in surveys with the structure of guidelines @abogdanoski
  • Make explicit why we want them to report separate files for different code functions (e.g. based on coding conventions for reproducibility). @fhoces
  • Check terminology and wording for consistency between guidelines and surveys @abogdanoski
    • scale with own data
    • scale with admin/confidential data
    • outputs -> display items

other to do's

  • Verify that searching capability works

  • figure out deployment server

  • add examples

  • figure out right formatting for epub

  • Improve assessment spreadsheet (examples, preformat cells, make a copy button, etc)

  • write up examples of reproduction strategies

  • fix bug with "Extenstions" title in cover @joelferg

  • Replace all links to the standardized spreadsheet with the final version (that makes a copy) before publication @joelferg

Introduction chapter updates

  • Add instructions on how to use the Guide together with the platform (e.g. read the guide first or read them side by side)
  • Add a note that definitions of basic concepts can be found in the Glossary
  • Add a note: the Guide is principally intended for reproductions of economics research, but may be used in other social science disciplines. Invite contributions that “translate” it to other disciplines
  • Number sub-headings in this chapter
  • In “Stages of the exercise”: add Stage 0
  • Fix: “These guidelines do not include a possible fifth stage of extension. Here you may extend the current paper by including new methodologies or data. If you where to extend the same methodology and research question into a different sample, that would bring you closer to a replication.”
  • Clarify the chronological sequence of the stages: they are not (?) based on a strict order except for Scoping (also explain why that is and note that user won’t be able to edit once they’ve completed that part of the form). @fhoces
  • Remove “Recording the results of the exercise”
  • Explain figure 0.3 about relevant unit of analysis at each stage (add in “beyond binary judgments”) @fhoces

Add instructions on recording big data files in revised reproduction package

To add in Improvements section:
If you think that your reproduction package will exceed this limit, please do the following:

  • Separate your reproduction package into two: (1) data and (2) code and documentation.
  • Post the second one in a trusted repository.
  • For the data reproduction package, identify all the files that differ from the original reproduction package and upload only those. For example, suppose the original repro package has data/raw_data1.csv and data/clean_data/data_set1.dta. If you modify only the file data_set1.dta, then upload a revised reproduction package that has the same folder structure but only the files that differ: data/clean_data/data_set1.dta. To make this even clearer, you could add a README file describing the modified files (see the sketch below).

Examples of reproduction trees

  • Reference this chapter wherever appropriate (in Assessment when we first introduce the diagrams and possibly in Scoping)
  • Replace “ACRE diagram builder” -> “diagram builder”

next stage

  • build code that analyses the incoming data from surveys and spreadsheets

Definitions to draft

  • Reasonable specification
  • Specification
  • Digital Object Identifier
  • Revised reproduction package

To do for spreadsheet

  • Make separate, view only example workbook and move all example sheets there
  • Populate example workbook with better/more concrete/more examples
  • Remove hyperlinks in .do files/make it so specific file types don't automatically become hyperlinks
  • Include something about apostrophes before +'s in improvement sheet
  • Confirm all links function correctly

reviewing responses to survey 2 of pilot 2 #36

  • ask for complete URL for repositories

  • Set up individual survey per display item

  • clarify the connection between survey 1 (claims) and survey 2 (display items)

  • Fix skip pattern with "I am not sure" option for scoring levels.

  • Create a template of reproduction package including recording sheets and diagram trees.

  • Emphasize at the beginning of the survey that the reproducer needs to create a repro package.

  • Create automated reports

  • Provide several template grading rubrics.

  • Add "data format" and "specific software" to improvement options

  • Add question on whether differences in a display item relate to pre-specified claims (or correspond to a different section of the DI that does not fundamentally connect to or relate to the claim)

  • Add questions where reproducers must score some parts of other reproducers' work (summary of claims, repro package, etc)

Robustness chapter updates

  • paragraph 5 says that robustness is assessed at the claim level -- should this be mentioned earlier in the chapter? @fhoces please confirm
  • It’s a bit unclear where “entry id” comes from, though it seems that it’s simply the order in which the analytic choice is recorded @fhoces - write instructions
  • 4.2 remove leftover comment “[should this be “robustness check” instead?]”
  • What does the mapping database look like on the platform? @fhoces - add instructions

aesthetics

  • Turn figure 2 into an HTML figure + table


Definitions to revise

  • Claim
    Is causal vs descriptive the only categorization of claims?
    Alternative definition (from RepliCATs): “A research claim is a single major finding from a published study (for example, a journal article), as well as details of the methods and results that support this finding. A research claim is not equivalent to an entire article. Sometimes the claim as described in the abstract does not exactly match the claim that is tested. In this case, you should consider the research claim to be that which is described in the inferential test, as the next stage of SCORE will focus on testing the replicability of the test results only.”
  • Coding error - remove leftover comment “[Aleks/Fernando]” + remove “ACRE procedure”
  • Disclosure -- this sounds like transparent reporting but is called “disclosure”; can you confirm that “disclosure” is the most appropriate term to define here?
  • Literate programming - replace “best practice”-> reproducibility practice
  • Researcher degrees of freedom -- consider extending this definition
  • Reproduction package - remove “At this point you are only assessing the existence of one (or more) reproduction packages, you will not be assessing the quality of its content at this stage.”
  • Analysis code – move from 10.2 to 10.1
  • Display item

Concluding the Reproduction chapter updates

  • Introduction: “If you wish to modify your reproduction after submitting it, you will have to record a new reproduction attempt on the platform and link to the previously completed reproduction.” Sounds right for now, but do we want to make it easy to copy an old reproduction? If so, should this be included in the Guide? @fhoces
    5.1
  • Line 11: Aleks left two questions here about reproduction package size and about referencing original files when you don’t have permission to share: “If your new reproduction package is larger than 2Gb (?) [A: how is this determined? It sounds arbitrary.], or it contains data that you don’t have permission to share, remove the specific files from your reproduction package and add a reference to the original reproduction package [A: what does making a reference mean and look like in this context? I suggest providing an example of how to do that.].” Possible approaches: (1) download the repository, access the data there, and give a relative path; or (2) go to the website, download/obtain the data, and deposit it in your own package. @fhoces
  • Line 13: Indent more
  • Lines 15, 16: Should these be greyed out?
  • Line 19: Reproduction reports – I think the final language here will depend on the privacy/sharing options. We’ll also need to state who these are sharable with. @fhoces Lets make sure this works with our anonymity/sharing options.
    Anonymity and data sharing:
  • Line 31: Select and update embargo period; suggested language for separating identifiable data from unidentifiable @fhoces Please review the language I suggested here.
  • Line 33: Update to reflect anonymity/privacy decisions
