May 29, 2012      Volume 31, Issue 10

COSSA Washington Update
In This Issue
SBE Advisory Committee Holds Meeting
Census Bureau Releases 2010 Count Accuracy Data; GAO Offers Advice for 2020
Spending Panels Approve FY 2013 DHS Spending Bills: Increases for Research, Development and Innovation
National Geospatial Policy Focus of Legislation
NSF Hosts Global Summit on Merit Review: International Scientific Collaboration Necessitates Common Principles
Department of Education Releases Annual "Condition of Education" Report
Education Policy Center Report Examines Student Motivation
No Credible Findings Regarding Research on Deterrence and the Death Penalty According to NAS Panel

SBE Advisory Committee Holds Meeting

With the House's recent vote to eliminate the National Science Foundation's (NSF) political science program (see Update, May 14, 2012) on people's minds, NSF's Social, Behavioral and Economic Sciences (SBE) directorate brought its Advisory Committee (AC) to the Foundation's headquarters on May 17 and 18. The AC, chaired by Anno Saxenian of the University of California, Berkeley, heard from a number of Foundation and Directorate officers about a variety of topics and issues.

Cora Marrett, NSF's Deputy Director and the first Assistant Director (AD) for SBE, was keenly aware of the difficult situation the House vote had created for the directorate, but as always saw the glass as half-full. She spoke of the "time of opportunities" for SBE, given the strong interest in fundamental research on human behavior and social organizations. She also said it was a very bad idea to "cherry-pick" among disciplines, given NSF's holistic approach to science.

Of course, Marrett admitted, there were limits on SBE's activities because of the current overall fiscal constraints and the continued practice of annual budgeting. These constraints have led SBE to form partnerships with other agencies, such as the National Oceanic and Atmospheric Administration (NOAA) for cooperative projects on climate change decision-making and risk. She also noted the new cross-agency Neuroscience initiative highlighted in the House Commerce, Justice, Science Appropriations Subcommittee report. Mark Weiss, director of SBE's Behavioral and Cognitive Sciences division, mentioned a new attempt at coordinating efforts in forensic science that has brought SBE together with the National Institute of Justice.

Mumpower New SES Division Director

Myron Gutmann, the current AD for SBE, updated the Committee on the directorate's activities. He announced the appointment of Jeryl Mumpower, currently at the Bush School of Government and Public Service at Texas A&M, as the new director of the Social and Economic Sciences (SES) division. Mumpower, a former program officer for Decision, Risk and Management Sciences, will replace Rachel Croson, who will return to the University of Texas at Dallas in September after two years at NSF. Mumpower previously taught and held administrative positions at the University at Albany, State University of New York. He holds a Ph.D. in social and quantitative psychology from the University of Colorado.


Gutmann reviewed the FY 2013 budget request (see Update, March 20, 2012). He also noted SBE's role in some of NSF's cross-directorate initiatives, including Cyberinfrastructure for the 21st Century (CIF-21), Science, Engineering and Education for Sustainability (SEES), Expeditions in Education, the National Cybersecurity Initiative, the Innovation Corps, and Integrated NSF Support for Promoting Interdisciplinary Research and Education (INSPIRE).

In addition, he mentioned other possibilities for FY 2013, including enlarged international partnerships, specifically with European funding agencies; another round of NSF-Census Research Network activities; and workshops to determine the next steps in funding the science of learning after the Science of Learning Centers end in 2014. Gutmann also announced a mid-September conference of Principal Investigators from the Department of Defense's Minerva Project; the NSF version of Minerva, now called Social and Behavioral Dimensions of National Security, Conflict and Cooperation; and the Research Councils UK's Global Uncertainties program.

Data and Survey Issues

Jon Krosnick of Stanford University, chair of the AC's Subcommittee on Data and Surveys, reported that his group was exploring the results from the National Academies' Committee on National Statistics' panel on the Future of Social Science Data Collection, chaired by Roger Tourangeau of the University of Maryland, a former COSSA Executive Committee member. Krosnick and his colleagues are also investigating how SBE's three big surveys (the Panel Study of Income Dynamics, the American National Election Studies, and the General Social Survey) could combine some of their activities to achieve economies of scale and free up funds for new data collections.

Farnam Jahanian, the current Assistant Director for NSF's Computer and Information Science and Engineering (CISE) directorate, spoke to the Committee about the Big Data Initiative. He noted that the current data deluge (there are 5.3 billion mobile phone subscribers) is transforming science, and suggested that we have moved from hypothesis-driven to data-driven discovery. Yet problems such as privacy, confidentiality, data management, and data access remain.

Rachel Croson and John Yellen, program director for Archaeology, gave SBE's take on the Big Data situation. The directorate has undertaken four initiatives in this area:

            1) A solicitation on Metadata for Large Social Science Surveys to improve the accessibility and usability of existing data resources;

            2) The Digging Into Data Challenge, a competition to promote innovative humanities and social science research using large-scale data analysis;

            3) The NSF-Census Research Network to improve the timeliness, longitudinality, and dimensionality of social and economic science data through research and education activities; and

            4) Building Community and Capacity for Data-Intensive Research in the Social, Behavioral, and Economic Sciences and in Education and Human Resources to enable the development of new, large-scale next generation data resources and analytic techniques.

The National Center for Science and Engineering Statistics (NCSES) is the directorate's division for the collection, analysis, and dissemination of information about the science and engineering enterprise. It has undergone a transformation with the retirement of its long-time director, Lynda Carlson. John Gawalt is the current Acting NCSES director.

Staff told the AC that NCSES has established a Survey Sponsor Data Center that will provide a secure room at NSF for researchers to gain access to the data from the National Survey of College Graduates and the Business Research and Development and Innovation Survey (BRDIS).   Working with the Census Bureau, NCSES hopes to improve its data collection programs and its ongoing methodological research on the accessibility of confidential data.   One possible roadblock is the need for IRS approval to gain access to some of the BRDIS data.

NCSES is also redesigning its Scientists and Engineers Statistical Data System program. It will discontinue the Survey of Recent College Graduates, since the American Community Survey now provides more detailed information on young college graduates (assuming the ACS survives as a viable data collection instrument). The Survey of Doctorate Recipients will continue without any changes. NCSES is also moving forward on its administrative records project to help it report research and development agency and program budgets.

The AC also heard about NSF's role in the Neuroscience Initiative mentioned above and a report on the Science of Broadening Participation, an attempt to systematically analyze, through research, the challenges to diversifying the scientific and engineering enterprise.

The Advisory Committee's next meeting will take place on November 15-16, 2012.


Census Bureau Releases 2010 Count Accuracy Data; GAO Offers Advice for 2020

While the Census Bureau waits for the Senate and the Administration to determine the fate of its American Community Survey (the House voted to abolish it, see Update, May 14, 2012) and whether Congress will appropriate enough money to carry out its other functions, including the 2012 Economic Census, the Bureau on May 22 looked back at the accuracy of the 2010 count.

The Bureau released the results of its post-enumeration survey called Census Coverage Measurement. By surveying a sample of the 300.7 million people living in housing units and then matching the responses to the census, the Bureau can estimate the error rates in the 2010 decennial.

The results, according to the Bureau, "found that the 2010 Census had a net overcount of 0.01 percent, meaning about 36,000 people were overcounted in the census." This is not statistically different from zero. By comparison, the 2000 Census had an estimated net overcount of 0.49 percent and the 1990 Census had a net undercount of 1.61 percent.   Census director Bob Groves indicated his pleasure at the results, noting: "on this one evaluation - the net undercount of the total population - this was an outstanding census."

 

The Census Bureau also released estimates of the components of coverage: the number of correct census records, erroneous enumerations, and omissions. The Bureau estimates that among the 300.7 million people who live in housing units, about 94.7 percent were counted correctly, about 3.3 percent were counted erroneously, 1.6 percent provided only a census count and had their demographic characteristics imputed, or statistically inserted, and 0.4 percent needed more extensive imputation after all census follow-up efforts were attempted. Among those erroneously counted, about 84.9 percent were duplicates. The remainder were counted incorrectly for other reasons, such as people who died before Census Day (April 1, 2010), people born after Census Day, or fictitious census records.


There were, according to the Bureau, an estimated 16 million omissions in the census. Omissions include people missed in the census and people whose census records were unverifiable in the post-enumeration survey because they did not answer enough of the demographic characteristic questions in the census. Of the 16 million omissions, about six million were likely counted in the census, but the post-enumeration survey could not verify them.

 

Difficulties Remain in Hard-to-Reach Populations

 

Breaking down the under/over count by demographic characteristics revealed continued difficulty in counting hard-to-reach populations. As with prior censuses, coverage varied by race and Hispanic origin. The 2010 Census overcounted the non-Hispanic white population by 0.8 percent, not statistically different from the overcount of 1.1 percent in 2000.


The 2010 Census undercounted 2.1 percent of the black population, which was not statistically different from a 1.8 percent undercount in 2000. In 2010, the Census undercounted 1.5 percent of the Hispanic population. In 2000, the estimated undercount of this group was 0.7 percent, not statistically different from zero. The difference between the two censuses was also not statistically significant.

The Census Bureau did not measure a statistically significant undercount for the Asian or for the Native Hawaiian and Other Pacific Islander populations in 2010 (at 0.1 percent and 1.3 percent, respectively). These estimates were also not statistically different from the results measured in 2000 (a 0.8 percent overcount and a 2.1 percent undercount, respectively).


Coverage of the American Indian and Alaska Native population in the 2010 decennial varied by geography. American Indians and Alaska Natives living on reservations were undercounted by 4.9 percent, compared with a 0.9 percent overcount in 2000. The net error for American Indians not living on reservations was not statistically different from zero in 2010 or 2000.

 

The 2010 Census undercounted renters by 1.1 percent, showing no significant change compared with 2000. Homeowners were overcounted in both the 2000 and 2010 censuses; however, the 2010 Census reduced the net overcount for homeowners from 1.2 percent to 0.6 percent. The Census was more likely to duplicate renters than owners.


Men 18 to 29 and 30 to 49 were undercounted in 2010, while women 30 to 49 were overcounted, a pattern consistent with 2000.

 

For further information about the accuracy of the 2010 count go to:

http://2010.census.gov/news/press-kits/ccm/ccm.html.

 

GAO Recommends Additional Steps for 2020 Planning

 

In keeping with its strong oversight of the Census Bureau, the Government Accountability Office (GAO) in May released a report, "2020 Census: Additional Steps Are Needed to Build on Early Planning."

 

GAO notes that its prior work has shown that to ensure a cost-effective 2020 decennial, the Bureau must "reexamine its management and culture as well as the fundamental design of the census." It commends the Bureau and its leadership for the consistency in its early planning and preparation efforts for the 2020 Census. GAO reports that the Bureau has taken steps "in accordance with selected leading practices that GAO identified for (1) organizational transformation, (2) long-term project planning, and (3) strategic workforce planning."

 

According to GAO, the Bureau, with the Director leading, is undertaking an organizational transformation of its entire decennial directorate in order to improve collaboration and communication across its divisions, improve operational efficiencies, and instill a culture that encourages risk-taking and innovation without fear of reprisal. This, GAO asserts, will help the Bureau control costs more effectively and enumerate the population in 2020.

 

Yet, GAO declares: "The amount of change-related activity the Bureau is considering as part of its reorganization of its decennial directorate may not be aligned with the resources the Bureau has allocated to plan, coordinate, and carry it out, and, as a result, the planned transformation efforts may not be sustainable or successful."

 

GAO also commends the Bureau for "taking steps consistent with many of the leading practices for long-term project planning." It has, according to GAO, created a high-level schedule of program management activities for the remaining phases; documented key elements such as the Bureau's decennial mission, vision, and guiding principles; and produced a business plan, updated annually, to support budget requests.

 

However, GAO suggests: "The Bureau's schedule does not include milestones or deadlines for key decisions needed to support transition between the planning phases, which could result in later downstream planning activity not being based on evidence from such sources as early research and testing."

 

In addition, GAO criticizes the Bureau for failing to effectively reach out to its congressional stakeholders about its reexamination of census processes and design, "which could result in a lack of support on potentially complex or sensitive topics that can be crucial for creating a stable environment in which to prepare for a census."

 

With regard to strategic workforce planning, GAO finds that the Bureau has identified current and future critical occupations through a pilot assessment of the skills and competencies needed for selected information technology positions for the 2020 Census. Again, GAO suggests the Bureau can do better. The Bureau has, GAO notes, done little so far to identify the goals that should guide workforce planning or to determine how to monitor, report, and evaluate its progress toward achieving them, which could help it identify and avoid possible barriers to implementing its workforce plans.

 

Of course, abolishing the American Community Survey (ACS), as the House has voted, will make all of this planning more difficult and costly. That is why over 550 organizations, including COSSA, signed a letter to the Senate, and why groups all over the country are trying to convince the Senate to keep the ACS alive and to not make it voluntary.

 

For access to the GAO report, go to: http://www.gao.gov/products/GAO-12-626.

Spending Panels Approve FY 2013 DHS Spending Bills: Increases for Research, Development and Innovation

 

The House Appropriations Committee on May 16 and the Senate Appropriations Committee on May 20 approved their respective FY 2013 spending bills for the Department of Homeland Security (DHS).

 

In both bills the Research, Development and Innovation (RD&I) account received substantial increases over FY 2012, with the Senate panel a bit more generous. The Senate Committee recommended $478 million, the same as the President's request, and the House Committee $405.6 million. The FY 2012 figure was $265.8 million. The Administration had requested, and both Houses had gone along with, a plan in FY 2012 to allow the Department to decide how to spend the funds within the research initiatives in the account. For FY 2013, the Senate panel went back to prescribing RD&I spending, including $143.7 million for Disaster Resilience; $25.5 million for Counter Terrorism; and $64.5 million for Cybersecurity research. Since the House reduced the funding from the request, it told the Department to submit a funding plan within 15 days of enactment of the appropriations bill, with project-level details on how the Science and Technology Directorate intends to fund individual research initiatives within each area.

The University Programs account, which funds the Centers of Excellence program, received $40 million for FY 2013 from each panel, the same as the President's request. The House added some directive language: "The Committee believes that it would help maximize the Centers' return on investment, and be consistent with previous suggestions by the Administration, if the competitive awards made to the Centers each year were based on performance reviews, conducted as part of the University Programs' internal review process."

 

The Committee also told DHS to evaluate how it might establish an Experimental Program to Stimulate Competitive Research (EPSCoR), similar to the long-standing program at the National Science Foundation (NSF) that has since spread to other agencies. It directed DHS to consult with NSF on this request.

 

The timing of House and Senate floor action on the bill is uncertain, although Congress has usually completed work on the DHS spending bill before the start of the fiscal year on October 1.

 

Other FY 2013 Appropriations News

 

On May 24, the Senate Appropriations Committee approved its version of the FY 2013 State and Foreign Operations spending bill. It recommended a significant increase for the State Department's Educational and Cultural Exchange programs. For FY 2013 the panel provided $625 million, which is $26.2 million above FY 2012 and $38 million above the President's request. Of that total, the Committee allocated $352.5 million for Academic Exchanges, an increase of $15.2 million over FY 2012.

 

In the same bill, the Senate Committee provided $38.2 million for the United States Institute of Peace, $800,000 below FY 2012, but $800,000 above the President's request.

 

The Senate Appropriations Committee has also recommended its FY 2013 spending numbers for the Department of Housing and Urban Development (HUD). The panel allocated $46 million for the Office of Policy Development and Research, rejecting the Administration's proposed $6 million increase over FY 2012. It also recommended no funds for the Doctoral Dissertation Research Program.

 

The Committee provided language in its report regarding its action: "The Committee supports the administration's focus on collecting and utilizing data to develop housing policy. However, in the current fiscal environment, priority must be given to programs that directly serve low-income Americans who rely on HUD programs. Given the budget reductions, the Committee encourages HUD to partner with other researchers to pursue valuable housing research opportunities."

The Committee continued to fund HUD's Transformation Initiative (TI), but at a significantly reduced level from the Administration's request: $43 million, compared with the requested $120 million. This is also $7 million below last year's level. The TI has three elements: (1) research, evaluation, and program metrics; (2) program demonstrations; and (3) technical assistance and capacity building. Funding to support these activities is provided by transfers from HUD programs.

The Committee also added, in report language, some prescriptions on how HUD can use the funds: "Within the reduced level of funding provided, the Committee will allow HUD to determine the appropriate use of funding among the requested projects. However, the Committee continues to emphasize the importance of fully funding projects. The Committee expects the following projects, designed to improve program management or reduce costs, to be adequately funded: research on energy efficiency and utility costs, disaster resiliency focused on mitigating damage from disasters, the Moving to Work Evaluation, and Public and Indian Housing Integrated Technical Assistance focused on troubled Public Housing Authorities. The recommendation does not include funding for the Natural Experiments Grant Program or Demonstration and Related Small Grants."

National Geospatial Policy Focus of Legislation

The issue of a national geospatial policy has long been a focus of government at all levels. The federal government has recognized the need to organize and coordinate the collection and management of geospatial data since at least 1990, when the Office of Management and Budget (OMB) established the Federal Geographic Data Committee (FGDC) "to promote the coordinated use, sharing, and dissemination of geospatial data nationwide." In 1994, OMB specified that FGDC shall coordinate development of the National Spatial Data Infrastructure (NSDI).

 

Congress has also recognized the challenge of coordinating and sharing geospatial data, and in these constrained budget times it is scrutinizing the cost to the federal government of gathering and coordinating this information. Rep. Doug Lamborn (R-CO), chairman of the Subcommittee on Energy and Mineral Resources of the House Natural Resources Committee, has introduced legislation (HR 4233) he calls the "Map It Once, Use It Many Times Act."

 

The legislation notes that: "Geospatial data is necessary and essential to (A) the management of natural resources; (B) economic development; (C) the management, adjudication, and prevention of future disruptions in the home mortgage system; (D) the development and implementation of a smart energy grid; (E) the deployment of universal domestic broadband service; (F) the management of Federal real property assets; (G) emergency preparedness and response; (H) homeland security; (I) the delivery of efficient health care and other services provided, financed, or regulated by the Federal Government; and (J) the maintenance, rehabilitation, and enhancement of public works, transportation, and other infrastructure of the United States." Furthermore: "The geospatial technology field is a high growth, high demand, and economically vital sector of the economy of the United States."

The Chairman's bill reflects concern that the FGDC is not working very well, that Federal agencies are not effectively using geospatial technologies, and that efforts to reduce redundancies in geospatial investments have not been fully successful. The legislation claims that "Federal agencies are still independently acquiring and maintaining potentially duplicative and costly data sets and systems and until these problems are resolved, duplicative geospatial investments are likely to persist."

According to a summary by the Congressional Research Service, the legislation would establish a National Geospatial Policy Commission that would develop a National Geospatial Data Plan. It would require the Commission to identify in the plan each geospatial activity performed by the federal government that: (1) is unnecessary and provide for its elimination, or (2) may be converted to performance by a private geospatial firm or a state or local government.

 

It would also direct the Administrator of a National Geospatial Technology Administration within the United States Geological Survey (USGS) to develop: (1) a strategy for encouraging the use of private geospatial firms by federal agencies and other entities that receive federal funding, including foreign governments; (2) a Geospatial Research Plan to provide for U.S. investment in geospatial research and development activities; and (3) policy directives for the implementation of such activities.

 

On May 12, Lamborn's Subcommittee held a field hearing in Colorado Springs on the bill. Steve Jennings, Associate Professor of Geography and Acting Chair of the Department of Geography and Environmental Studies at the University of Colorado, Colorado Springs, testified before the panel. The Association of American Geographers and the American Geosciences Institute endorsed Jennings' testimony.

 

Jennings, who also serves as Coordinator for the Colorado Geographic Alliance, which is part of a network of fifty state geographic alliances across the country, told the panel "there is no doubt that geospatial technologies and data are becoming increasingly vital to government agencies, non-profit organizations, colleges and universities, and the private sector." He endorsed the legislation's proposal to create a National Geospatial Technology Administration within the USGS, saying "that the time has come for our national government to have a division focused on geospatial technologies and data."

 

At the same time, Jennings expressed concern that the focus on the private sector in Lamborn's legislation "could ultimately stifle innovation and jeopardize the development of a vibrant future geospatial workforce." He argued that these provisions "would limit or even preclude government funding for geospatial activities with researchers, college and university consortia, non-profit organizations, and/or other public entities." This would also, Jennings indicated, "be an especially-troublesome development given the recent emphasis on the need to protect our nation's competitiveness by enhancing education efforts in STEM fields, including geography and geospatial education."

 

He also objected that only one representative from the university community would serve on the 18-member National Geospatial Policy Commission, indicating that the lack of higher-education representation "would limit the voice of the sector that is most-heavily focused on key geospatial research and training issues."

 

For more on the hearing go to: http://naturalresources.house.gov/Calendar/EventSingle.aspx?EventID=290854.

NSF Hosts Global Summit on Merit Review: International Scientific Collaboration Necessitates Common Principles

The National Science Foundation (NSF) invited the heads of science and engineering funding agencies from approximately 50 countries or regions to a Global Summit on Merit Review at its headquarters in Arlington in mid-May.   The purpose was to foster further international cooperation in science by developing a common "Statement of Principles on Merit Review."

 

The Summit was a follow-up to meetings held in October 2010 of the European Science Foundation and the European Heads of Research Councils and subsequent regional meetings around the world - in Brazil (for the Americas), South Africa (for Africa), India (for Asia and Australasia), Saudi Arabia (for North Africa and the Middle East) and Brussels (for Europe) - that attempted to integrate practices on merit review.

 

According to the Statement, there were two primary objectives in developing the principles. First, the worldwide agreement on core, high-level principles should foster international cooperation among funding agencies that support the scientific research community. According to a footnote, "scientific research can include the sciences, arts, and humanities." Second, for those countries that are developing new funding agencies, the principles provide a global consensus on the key elements necessary for a rigorous and transparent review system.

 

The following are the agreed-upon principles:

 

Expert Assessment

Collectively, reviewers should have the appropriate knowledge and expertise to assess the proposal both at the level of the broad context of the research field(s) to which it contributes and with respect to the specific objectives and methodology. Reviewers should be selected according to clear criteria.

 

Transparency

Decisions must be based on clearly described rules, procedures and evaluation criteria that are published a priori. Applicants should receive appropriate feedback on the evaluation of their proposal.

 

Impartiality

Proposals must be assessed fairly and on their merit. Conflicts of interest must be declared and managed according to defined, published processes.

 

Appropriateness

The review process should be consistent with the nature of the call, with the research area addressed, and in proportion to the investment and complexity of the work.

Confidentiality

All proposals, including related data, intellectual property and other documents, must be treated in confidence by reviewers and organizations involved in the review process.

 

Integrity and Ethical Considerations

Ethics and integrity are paramount to the review process.

 

The participants agreed that in this new era of international cooperation and collaboration "rigorous and transparent scientific merit review helps to assure that government funding is appropriately expended on the most worthy projects to advance the progress of science and address societal challenges."

 

Department of Education Releases Annual "Condition of Education" Report

On May 24, the National Center for Education Statistics (NCES) released its annual report, The Condition of Education. This year's report presents 49 indicators of important developments and trends in U.S. education.

The indicators focus on participation in education, elementary and secondary education and its outcomes, and postsecondary education and its outcomes. The report also takes a closer look at high school in the United States over the last twenty years.

In the 2010-2011 academic year, approximately 49.5 million students were enrolled in public elementary and secondary schools. NCES projects that enrollment will increase by seven percent to 53.1 million students by 2021.

According to the report, the last two decades have seen tremendous changes in the racial and ethnic composition of U.S. public schools. Of the 12.5 million public high school students in 1995-96, some 67 percent were White, 16 percent were Black, and 12 percent were Hispanic. By 2010-11, the White share of the high school student population had decreased to 56 percent, and the Black share had increased slightly to 17 percent. The Hispanic share, however, increased dramatically to 20 percent. By 2019-20, public high school enrollment is projected to be 53 percent White, 16 percent Black, and 23 percent Hispanic.

The Condition of Education report indicators show progress on the National Assessment of Educational Progress (NAEP) in reading and mathematics among 4th- and 8th-graders. Unfortunately, on both mathematics and reading assessments, significant gaps among racial and ethnic groups remain. However, the gaps for 4th grade mathematics and reading between White and Black students have narrowed since the assessments were first given.

The average grade 4 reading score in 2011 was not measurably different from that in 2009, while the average grade 8 reading score was one point higher in 2011 than in 2009. At grades 4 and 8, the average mathematics scores in 2011 were higher than the average scores for those grades in all previous assessment years. At grade 12, the score for the U.S. history assessment was two points higher in 2010 than in 1994, while the geography score was two points lower. There was no measurable difference in the civics score from 1998 to 2010.

The report also looks at indicators in higher education. Between 2000 and 2010, undergraduate enrollment in postsecondary institutions increased by 37 percent, from 13.2 million to 18.1 million students. NCES projects undergraduate enrollment to continue to increase, reaching 20.6 million students by 2021. Over the last decade, the number of degrees earned by students also increased: by 50 percent for associate's degrees, 33 percent for bachelor's degrees, 50 percent for master's degrees, and 34 percent for doctoral degrees. Graduate school enrollment has increased every year since 1983, reaching 2.9 million students in 2010. The Department projects enrollment in master's and doctoral programs to increase through 2021 to 3.5 million students.

The indicators show that women continue to make gains in attaining higher education degrees. Approximately 61 percent of female and 56 percent of male first-time, full-time students who sought a bachelor's degree at a four-year institution in fall 2004 completed their degree at that institution within six years. Continuing a trend, females earned the majority of degrees awarded at all degree levels in 2009-10. Women have comprised more than half of post-baccalaureate enrollment every year since 1988. In 2009-10, females earned 60 percent of master's degrees and 52 percent of doctor's degrees awarded.

NCES produces reports each year that present findings from NAEP data and about the U.S. education system. For all 49 indicators and the complete Condition of Education 2012 report, go to: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2012045.

Education Policy Center Report Examines Student Motivation

A series of papers released on May 22 by the Center on Education Policy (CEP) underscores the need for teachers, schools, parents, and communities to pay more attention to student motivation in school reform efforts. The CEP report, Student Motivation: An Overlooked Piece of School Reform, offers six background papers that examine:

  • What Is Motivation and Why Does It Matter?
  • Can Money and Other Rewards Motivate Students?
  • Can Goals Motivate Students?
  • What Roles Do Parents, Family Background, and Culture Play in Student Motivation?
  • What Can Schools Do To Better Motivate Students?
  • What Nontraditional Approaches to Learning Can Motivate Unenthusiastic Students?

Research shows motivation is crucial for students' success since it can affect how they approach school, how they relate to teachers, and how much time and effort they devote to their studies.

Schools, teachers, and administrators play an important role in student motivation. Teachers can affect motivation through their interactions with students, their assignments and tests, and their classroom climate. The paper What Can Schools Do To Better Motivate Students? gives several examples of what schools can do to motivate students to achieve, including adopting programs that encourage students to view postsecondary education as a goal; instituting programs to provide low-income parents with information and resources; adopting programs to identify potential dropouts and other students who show signs of low motivation; and providing professional development to teachers on encouraging student motivation.

Studies have also shown a critical link between parent involvement and a child's educational development and subsequent success in school. The paper What Roles Do Parents, Family Background, and Culture Play in Student Motivation? offers ideas for parents and families to consider, such as taking an active interest in their children's education and talking to their children's teachers or schools about programs to help parents become partners in learning.

As with other reform efforts, communities are urged to play a bigger role by establishing extracurricular clubs and other activities outside of school to help foster student interest in academics and provide students with ways to demonstrate their competence. Communities should also support students by providing mentoring, scholarships, and information about college requirements to encourage college attendance as a goal.

Motivation remains a critical part of a student's academic success, but the value of motivation is often overlooked in current school reform efforts. All six papers take the view that more research is necessary to determine how best to use motivation to help all students succeed.

The full report can be found at: http://www.cep-dc.org/displayDocument.cfm?DocumentID=405.

No Credible Findings Regarding Research on Deterrence and the Death Penalty According to NAS Panel

 

In October 2010, the National Academy of Sciences' (NAS) Committee on Law and Justice convened a panel to begin a study that would "assess the evidence on the deterrent effect of the death penalty; i.e. whether the threat of execution prevents homicides." There had been "differing conclusions in more recent empirical studies about the effects of the legal status and actual practice of the death penalty on criminal homicide rates," and the committee wanted to figure out why.   Daniel Nagin of Carnegie Mellon University chaired the panel. He had co-authored, with Al Blumstein and Jacqueline Cohen of Carnegie Mellon, an earlier 1978 NAS report on the same subject.

 

The current Committee held six meetings, including workshops with researchers (see Update, May 16, 2011) and recently released its report.   The panel's major conclusion: "Research to date on the effect of capital punishment on homicide is not informative about whether capital punishment decreases, increases, or has no effect on homicide rates. Therefore, the committee recommends that these studies not be used to inform deliberations requiring judgments about the effect of the death penalty on homicide. Consequently, claims that research demonstrates that capital punishment decreases or increases the homicide rate by a specified amount or has no effect on the homicide rate, should not influence policy judgments about capital punishment."

 

Acknowledging their disappointment in reaching such a judgment, the Committee explained that "the relevant question about the deterrent effect of capital punishment is the differential or marginal deterrent effect of execution over the deterrent effect of other available or commonly used penalties, specifically, a lengthy prison sentence or one of life without the possibility of parole. One major deficiency in all the existing studies is that none specify the noncapital sanction components of the sanction regime for the punishment of homicide. Another major deficiency is the use of incomplete or implausible models of potential murderers' perceptions of and response to the capital punishment component of a sanction regime. Without this basic information, it is impossible to draw credible findings about the effect of the death penalty on homicide."

 

The report further noted that: "A lack of evidence is not evidence for or against the hypothesis. Hence, the committee does not construe its conclusion that the existing studies are uninformative as favoring one side or the other side in the long-standing debate about deterrence and the death penalty. The committee also emphasizes that deterrence is but one of many considerations relevant to rendering a judgment on whether the death penalty is good public policy."

 

Finally, the Committee emphasized its rather limited charge: evidence on the deterrent effect of the death penalty. Due to this charge, it did not investigate the moral arguments for or against capital punishment, or the empirical evidence on whether states administer capital punishment in a nondiscriminatory and consistent fashion. Nor did it investigate whether the risk of mistaken execution is acceptably small or how the cost of administering the death penalty compares to other sanction alternatives. Nonetheless, the panel concluded: "All of these issues are relevant to making a judgment about whether the death penalty is good public policy."

 

The full report is available at: http://www.nap.edu/catalog.php?record_id=13363.

 
Consortium of Social Science Associations 
Members

Governing Members

American Association for Public Opinion Research
American Economic Association
American Educational Research Association
American Historical Association
American Political Science Association
American Psychological Association
American Society of Criminology
American Sociological Association
American Statistical Association
Association of American Geographers
Association of American Law Schools
Law and Society Association
Linguistic Society of America
Midwest Political Science Association
National Communication Association
Population Association of America
Society for Research in Child Development

 
 
Membership Organizations

Academy of Criminal Justice Sciences
American Finance Association
American Psychosomatic Society
Association for Asian Studies
Association for Public Policy Analysis and Management
Association of Academic Survey Research Organizations
Association of Research Libraries
Council on Social Work Education
Economic History Association
History of Science Society
Justice Research and Statistics Association
Midwest Sociological Society
National Association of Social Workers
North American Regional Science Council
North Central Sociological Association
Rural Sociological Society
Social Science History Association
Society for Anthropological Sciences
Society for Behavioral Medicine
Society for Empirical Legal Studies
Society for Research on Adolescence
Society for Social Work and Research
Society for the Psychological Study of Social Issues
Southern Political Science Association
Southern Sociological Society
Southwestern Social Science Association


Centers and Institutes

American Academy of Political and Social Sciences
American Council of Learned Societies
American Institutes for Research
Brookings Institution
Center for Advanced Study in the Behavioral Sciences
Cornell Institute for Social and Economic Research
Institute for Social Research, University of Michigan
Institute for Women's Policy Research
National Opinion Research Center
Population Reference Bureau
RTI International
Social Science Research Council
Vera Institute of Justice

Colleges and Universities

Arizona State University
Boston University
Brown University
University of California, Berkeley
University of California, Los Angeles
University of California, San Diego
University of California, Santa Barbara
Carnegie Mellon University
University of Chicago
Clark University
University of Colorado
Columbia University
University of Connecticut
Cornell University
University of Delaware
Duke University
Georgetown University
George Mason University
George Washington University
Harvard University
Howard University
University of Illinois
Indiana University
Iowa State University
Johns Hopkins University
John Jay College of Criminal Justice, CUNY
University of Maryland
Massachusetts Institute of Technology
Maxwell School of Citizenship and Public Affairs, Syracuse
University of Michigan
Michigan State University
University of Missouri, St. Louis 
University of Minnesota  
Mississippi State University
University of Nebraska, Lincoln
New York University
University of North Carolina, Chapel Hill
North Dakota State University
Northwestern University
Ohio State University
University of Oklahoma
University of Pennsylvania
Pennsylvania State University
Princeton University
Rutgers, The State University of New Jersey
University of South Carolina
Stanford University
State University of New York, Stony Brook
University of Texas, Austin
University of Texas, San Antonio
Texas A & M University
Tulane University
Vanderbilt University
University of Virginia
University of Washington
Washington University in St. Louis
University of Wisconsin, Madison
University of Wisconsin, Milwaukee
Yale University
 

COSSA 
 

 
 
Executive Director:  Howard J. Silver
Deputy Director:  Angela L. Sharpe
Assistant Director for Government Affairs:  LaTosha C. Plavnik
Assistant Director of Public Affairs: Gina Drioane 

 
President:  Kenneth Prewitt 

  

Address all inquiries to COSSA at [email protected]  Telephone: (202) 842-3525


 

The Consortium of Social Science Associations (COSSA) is an advocacy organization promoting attention to and federal support for the social and behavioral sciences.

 
 
UPDATE is published 22 times per year.  ISSN 0749-4394.