%0 Journal Article
%J Reading Horizons
%D 2016
%T The Common Core Writing Standards: A descriptive study of content and alignment with a sample of former state standards
%A Troia, G. A.
%A Olinghouse, N. G.
%A Wilson, J.
%A Stewart, K. O.
%A Mo, Y.
%A Hawkins, L.
%A Kopke, R. A.
%B Reading Horizons
%G eng

%0 Journal Article
%J J. Statist. Planning Inf.
%D 2010
%T Masking methods that preserve positivity constraints in microdata
%A A. F. Karr
%A A. Oganian
%K constraints
%K positivity
%K SDL method
%K statistical disclosure limitation (SDL)
%B J. Statist. Planning Inf.
%V 141
%P 31-41
%G eng
%X

Statistical agencies have conflicting obligations: to protect confidential information provided by respondents to surveys or censuses, and to make data available for research and planning activities. When the microdata themselves are to be released, statistical agencies reconcile these objectives by applying statistical disclosure limitation (SDL) methods to the data, such as noise addition, swapping, or microaggregation. Some of these methods do not preserve important structure and constraints in the data, such as positivity of some attributes or inequality constraints between attributes. Failure to preserve constraints is not only problematic for data utility but may also increase disclosure risk. In this paper, we describe an SDL method that preserves both positivity of attributes and the mean vector and covariance matrix of the original data. The basis of the method is to apply multiplicative noise with a proper, data-dependent covariance structure.

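As a rough illustration of the idea, and not the paper's calibrated method, the Python sketch below applies unit-mean multiplicative lognormal noise to hypothetical positive-valued microdata. Every noise factor is positive, so positivity of the masked values is automatic; the paper goes further and chooses a data-dependent noise covariance so that the released data reproduce the original mean vector and covariance matrix exactly, which this sketch does not attempt.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical positive-valued microdata: 1000 records, 3 attributes.
    X = rng.lognormal(mean=3.0, sigma=0.5, size=(1000, 3))

    # Unit-mean multiplicative lognormal noise: E[exp(N(mu, s^2))] = 1
    # requires mu = -s^2 / 2. Positive factors keep masked values
    # positive, unlike additive Gaussian noise.
    s = 0.1
    noise = rng.lognormal(mean=-s**2 / 2, sigma=s, size=X.shape)
    Y = X * noise

    print("min original:", X.min(), " min masked:", Y.min())  # both > 0
    print("mean shift:", np.abs(Y.mean(axis=0) - X.mean(axis=0)))

Here the means are preserved only in expectation; matching the covariance matrix as well is precisely the calibration problem the paper addresses.
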
%0 Journal Article
%J Journal of Privacy and Confidentiality
%D 2009
%T Global measures of data utility for microdata masked for disclosure limitation
%A A. F. Karr
%A A. Oganian
%A J. P. Reiter
%A M.-J. Woo
%B Journal of Privacy and Confidentiality
%V 1
%P 111-124
%G eng

%0 Journal Article
%J Computational Statistics and Data Analysis
%D 2009
%T Verification servers: enabling analysts to assess the quality of inferences from public use data
%A J. P. Reiter
%A A. Oganian
%A A. F. Karr
%B Computational Statistics and Data Analysis
%V 53
%P 1475-1482
%G eng
%X

To protect confidentiality, statistical agencies typically alter data before releasing them to the public. Ideally, although it is generally not done, the agency also provides a way for secondary data analysts to assess the quality of inferences obtained from the released data. Quality measures can help secondary analysts identify inaccurate conclusions resulting from the disclosure limitation procedures, as well as have confidence in accurate ones. We propose a framework for an interactive, web-based system that analysts can query for measures of inferential quality. As we illustrate, agencies seeking to build such systems must consider the additional disclosure risks of releasing quality measures. We suggest some avenues of research on limiting these risks.

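The proposal is architectural, so there is no single algorithm to show; as a minimal sketch of the kind of quality measure a verification server might report, the Python fragment below fits the same regression to original and masked data and computes a simplified version of the confidence-interval-overlap utility measure discussed in the Karr et al. framework paper listed later in this bibliography. The data and the masking step are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def ols_ci(X, y, z=1.96):
        """OLS coefficients with approximate 95% confidence intervals."""
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (len(y) - X.shape[1])
        se = np.sqrt(np.diag(sigma2 * XtX_inv))
        return beta - z * se, beta + z * se

    # Hypothetical original microdata and a noise-masked release.
    n = 500
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    X_masked = X + np.column_stack([np.zeros(n), rng.normal(scale=0.3, size=n)])

    lo_o, hi_o = ols_ci(X, y)
    lo_m, hi_m = ols_ci(X_masked, y)

    # Overlap of the masked-data CI with the original CI, as a
    # fraction of the original CI's length (1 = no utility loss).
    overlap = (np.minimum(hi_o, hi_m) - np.maximum(lo_o, lo_m)) / (hi_o - lo_o)
    print("CI overlap per coefficient:", np.clip(overlap, 0.0, None))
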
%0 Journal Article
%J Computational Statistics & Data Analysis
%D 2008
%T Pooled ANOVA
%A Michael Last
%A Gheorghe Luta
%A Alex Orso
%A Adam Porter
%A Stan Young
%B Computational Statistics & Data Analysis
%V 52
%P 5215
%G eng

%0 Journal Article
%J IEEE Transactions on Software Engineering
%D 2007
%T Techniques for classifying executions of deployed software to support software engineering tasks
%A Murali Haran
%A Alan Karr
%A Michael Last
%A Alessandro Orso
%A Adam A. Porter
%A Ashish Sanil
%A Sandro Fouché
%B IEEE Transactions on Software Engineering
%V 33
%P 287-304
%G eng

%0 Conference Paper
%B Privacy in Statistical Databases: CENEX-SDC Project International Conference, PSD 2006, Rome, Italy, December 13-15, 2006, Proceedings
%D 2006
%T Combinations of SDC methods for microdata protection
%A A. F. Karr
%A A. Oganian
%E J. Domingo-Ferrer
%E L. Franconi
%8 December
%G eng

%0 Journal Article
%J The American Statistician
%D 2006
%T A framework for evaluating the utility of data altered to protect confidentiality
%A A. F. Karr
%A C. N. Kohnen
%A A. Oganian
%A J. P. Reiter
%A A. P. Sanil
%B The American Statistician
%V 60
%P 224-232
%G eng

%0 Conference Proceedings
%B Proc. ACM SIGSOFT Symposium Foundations of Software Engineering 2005
%D 2005
%T Applying classification techniques to remotely-collected program execution data
%A A. F. Karr
%A M. Haran
%A A. A. Porter
%A A. Orso
%A A. P. Sanil
%I ACM
%C New York
%G eng

%0 Conference Paper
%B Statistical Methods in Counterterrorism: Game Theory, Modeling, Syndromic Surveillance, and Biometric Authentication
%D 2005
%T Secure statistical analysis of distributed databases using partially trusted third parties
%A Alan F. Karr
%A Xiaodong Lin
%A Ashish P. Sanil
%A Jerome P. Reiter
%E D. Olwell
%E A. G. Wilson
%E G. Wilson
%I Springer-Verlag
%C New York
%G eng

%0 Journal Article
%J Environmetrics
%D 1995
%T The ability of wet deposition networks to detect temporal trends
%A Oehlert, Gary W.
%K discrete smoothing
%K wet deposition networks
%B Environmetrics
%V 6
%P 327-339
%G eng
%R 10.1002/env.3170060402
%X

We use the spatial-temporal model developed in Oehlert (1993) to estimate the detectability of trends in wet-deposition sulphate. Adjusting sulphate concentrations for precipitation volume dramatically improves the detectability and quantifiability of trends. Model-predicted decreases in sulphate of about 30 per cent in the Eastern U.S. by 2005 should be detectable much earlier, say by 1997, but accurate quantification of the true decrease will require several additional years of monitoring. A few stations can be deleted from the East without materially affecting the detectability or quantifiability of trends, and careful siting of new stations can substantially improve regional trend estimation.

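To see why a decline that is only fully realized by 2005 can be detected years earlier, consider a back-of-the-envelope power calculation for an OLS trend through annual means. The slope and residual standard deviation below are illustrative assumptions, not values from the paper, and the sketch ignores the spatial-temporal structure that the Oehlert (1993) model exploits.

    import numpy as np

    # Illustrative only: roughly a 30% decline over 1990-2005,
    # i.e. about 2% per year, against year-to-year noise.
    slope = -0.02   # fractional change per year
    sigma = 0.05    # residual sd of a volume-adjusted annual mean

    def trend_z(n_years, slope, sigma):
        """z-statistic of an OLS slope after n_years of annual data."""
        t = np.arange(n_years)
        s_xx = np.sum((t - t.mean()) ** 2)
        return abs(slope) / (sigma / np.sqrt(s_xx))

    for n in range(3, 16):
        z = trend_z(n, slope, sigma)
        flag = "  <- detectable at ~2 sigma" if z > 2 else ""
        print(f"{1990 + n - 1}: z = {z:.2f}{flag}")

Under these assumptions the 2-sigma threshold is crossed in the mid-1990s, consistent with the abstract's point that detection precedes accurate quantification by several years.
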
%0 Journal Article
%J Atmospheric Environment
%D 1995
%T Shrinking a wet deposition network
%A Oehlert, Gary W.
%K monitoring network
%K network design
%K spatial smoothing
%K trend analysis
%B Atmospheric Environment
%V 30
%P 1347-1357
%G eng
%R 10.1016/1352-2310(95)00333-9
%X

Suppose that we must delete stations from a monitoring network. Which stations should be deleted so that the remaining network has the smallest possible trend-estimate variances? We use the spatial-temporal model described in Oehlert (1993, J. Am. Statist. Assoc., 88, 390-399) to model the concentration of sulfate in wet deposition. Based on this model and three criteria, we choose good sets of candidate stations for deletion from the NADP/NTN network. The three criteria are that the sum of 11 regional trend-estimate variances, the sum of local trend-estimation variances, and the sum of local mean-estimation variances each be as small as possible. Good choices of stations for deletion result in a modest increase in the criteria (about 7 to 34%) when 100 stations are deleted from the network, while random sets of 100 stations can increase the criteria by a factor of two or more.

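The station-deletion problem invites a greedy search: repeatedly drop the station whose removal increases the design criterion least. The Python sketch below does exactly that, but with a toy criterion, the variance of a regional mean under a hypothetical exponential spatial covariance, rather than the model-based trend- and mean-estimation variances used in the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy network: random station locations and an exponential
    # spatial covariance (a stand-in for the Oehlert (1993) model).
    n_stations = 20
    coords = rng.uniform(0, 10, size=(n_stations, 2))
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    cov = np.exp(-dists / 3.0)

    def criterion(keep):
        """Variance of the regional mean over the retained stations."""
        idx = np.array(sorted(keep))
        return cov[np.ix_(idx, idx)].sum() / len(idx) ** 2

    keep = set(range(n_stations))
    for _ in range(5):  # greedily delete five stations
        best = min(keep, key=lambda s: criterion(keep - {s}))
        keep.remove(best)
        print(f"drop station {best:2d}; criterion now {criterion(keep):.4f}")

Comparing this greedy choice against randomly chosen deletion sets, as the paper does, would just repeat the criterion evaluation on random subsets of the same size.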