Politics : Mainstream Politics and Economics


To: Wharf Rat who wrote (626), 9/2/2011 10:11:07 PM
From: Nadine Carroll
 
Wharfie, reproducibility of results is a cornerstone of the scientific method. When you depart from it, you have left science and entered the realm of religion.



To: Wharf Rat who wrote (626), 9/3/2011 10:12:10 AM
From: Alastair McIntosh
 
Real scientists go out and collect their own data, and write their own codes.

Other real scientists will then often re-analyze data collected by others and examine the codes used.

Read this National Academy of Sciences report, "Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age":

nap.edu

Recommendation 5: All researchers should make research data, methods, and other information integral to their publicly reported results publicly accessible in a timely manner to allow verification of published findings and to enable other researchers to build on published results, except in unusual cases in which there are compelling reasons for not releasing data. In these cases, researchers should explain in a publicly accessible manner why the data are being withheld from release.

This principle may seem to apply only to publicly funded research, but a strong case can be made that much data from privately funded research should be made publicly available as well. Making such data available can produce societal benefits while also preserving the commercial opportunities that motivated the research. As discussed earlier, differences in technological infrastructure, publication practices, data-sharing expectations, and other cultural practices have long existed between research fields. In some fields, aspects of this “data culture” act as barriers to access and sharing of data. With the growing importance of research results to certain areas of public policy, the rapid increase of interdisciplinary research that involves integration of data from different disciplines, and other trends, it is important for fields of research to examine their standards and practices regarding data and to make these explicit.

Data accessibility standards generally depend on the norms of scholarly communication within a field. In many fields these norms are now in a state of flux. In some fields, researchers may be expected to disseminate data and conclusions more rapidly than is possible through peer-reviewed publications. Digital technologies are providing new ways to disseminate research results—for example, by making it possible to post draft papers on archival sites or by employing software packages, databases, blogs, or other communications on personal or institutional Web sites. Data sharing is greatly facilitated when a field of research has standards and institutions in place that are designed to promote the accessibility of data.