This benchmark volume addresses the debate over the effects of early industrialization on standards of living during the decades before the Civil War. Its contributors demonstrate that the aggregate antebellum economy was growing faster than any other large economy had grown before.
Despite the dramatic economic growth and rise in income levels, questions remain as to the general quality of life during this era. Was the improvement in income widely shared? How did economic growth affect the nature of work? Did higher levels of income lead to improved health and longevity? The authors address these questions by analyzing new estimates of labor force participation, real wages, and productivity, as well as of the distribution of income, height, and nutrition.
Trends in Archives Practice by the Society of American Archivists is a new, open-ended series of modules featuring brief, authoritative treatments — written and edited by top-level professionals — that fill significant gaps in archival literature. The goal of this modular approach is to build agile, user-centered resources. Each module will treat a discrete topic relating to the practical management of archives and manuscript collections in the digital age.
The first three modules address archival arrangement and description and are designed to complement Kathleen D. Roe's book, ARRANGING AND DESCRIBING ARCHIVES AND MANUSCRIPTS. They include:
STANDARDS FOR ARCHIVAL DESCRIPTION
Sibyl Schaefer and Janet M. Bunde
Untangles the history of standards development and provides an overview of descriptive standards that an archives might wish to use.
PROCESSING DIGITAL RECORDS AND MANUSCRIPTS
J. Gordon Daines III
Builds on familiar terminology and models to show how any repository can take practical steps to process born-digital materials and to make them accessible to users.
DESIGNING DESCRIPTIVE AND ACCESS SYSTEMS
Daniel A. Santamaria
Provides implementation advice on the wide range of tools and software that support specific needs in arranging, describing, and providing access to analog and digital archival materials.
"In a much-needed intervention, Ric McIntyre recasts the debate about globalization and labor rights and speeds us to the heart of the matter: the battle between transnational corporations who distance themselves from responsibility for the fate of workers, and labor activists who seek to reestablish bonds of accountability and moral obligation. The stakes in this struggle are enormous, and Dr. McIntyre provides crucial insight into the economic and political dynamics that define it."
---Scott Nova, Executive Director, Worker Rights Consortium, Washington, DC
"This book presents an insightful, powerful corrective to the contemporary debate over worker rights. McIntyre identifies the limitations of thinking of worker rights as individualized human rights and challenges us instead to examine how rights are defined through conventional thinking and class interest. The product is rich and compelling: McIntyre's investigation demands of us that we be far more attentive to the contradictory effects of ‘rights talk.' I recommend this book enthusiastically to all those who advocate for a just economic order the world over."
---George DeMartino, Associate Professor of Political Economy, the Josef Korbel School of International Studies, University of Denver
"An important contribution to the interdisciplinary study of labor. McIntyre's book will challenge the debate over labor rights on all fronts."
---Michael Hillard, Professor of Economics, University of Southern Maine
"A timely examination of our modern 'sweating system' . . . essential reading for all workers who hope for greater dignity in the workplace and greater fairness in society."
---Janet Knoedler, Associate Professor of Economics, Bucknell University
"Ric McIntyre convincingly shows how local actions, regulations changes, and international norms can combine to establish collective rights for workers."
---Gilles Raveaud, Assistant Professor in Economics, University of Saint-Denis, France, and cofounder of the "post-autistic economics movement"
"An important, timely, and needed contribution to our understanding of worker rights."
---Patrick McHugh, Associate Professor of Management, George Washington University
"Workers of the world, unite!" Karl Marx's famous call to action still promises an effective means of winning human rights in the modern global economy, according to economist Richard P. McIntyre. Currently, the human rights movement insists upon a person's right to life, freedom, and material necessities. In democratic, industrial nations such as the United States, the movement focuses more specifically on a person's civil rights and equal opportunity.
The movement's victories since WWII have come at a cost, however. The emphasis on individual rights erodes collective rights---the rights that disadvantaged peoples need to assert their most basic human rights. This is particularly true for workers, McIntyre argues. By reintroducing Marxian and Institutional analysis, he reveals the class relations and power structures that determine the position of workers in the global economy. The best hope for achieving workers' rights, he concludes, lies in grassroots labor organizations that claim the right of association and collective bargaining.
At last, an economist offers a vision for human rights that takes both moral questions and class relations seriously.
Richard P. McIntyre is Director of the University Honors Program and Professor of Economics at the University of Rhode Island.
This Code of Practice sets out the requirements for the design, specification, installation, commissioning, operation and maintenance of grid-connected solar photovoltaic (PV) systems installed in the UK. It is aimed at ensuring safe, effective and competently installed solar PV systems.
SATs, ACTs, GPAs. Everyone knows that these scores can’t tell a college everything that’s important about an applicant. But what else should admissions officers look for, and how can they know it when they see it? In College Admissions for the 21st Century a leading researcher on intelligence and creativity offers a bold and practical approach to college admissions testing.
Standardized tests are measures of memory and analytical skills. But the ever-changing global society beyond a college campus needs more than just those qualities, argues Robert Sternberg. Tomorrow’s leaders and citizens also need creativity, practicality, and wisdom.
How can the potential for those complex qualities be measured? One answer is “Kaleidoscope,” a new initiative in undergraduate admissions, first used at Tufts University. Its open-ended questions for applicants, and the means used to score the answers, give applicants and admissions officers the chance to go beyond standardized tests.
Does it work? As Sternberg describes in detail, Kaleidoscope measures predicted first-year academic success, over and above SATs and high school GPAs, and predicted first-year extracurricular activities, leadership, and active citizenship as well. And every year that Kaleidoscope measures were used, the entering class’s average SATs and high school GPAs went up too.
What worked at Tufts can work elsewhere. New kinds of assessments, like Kaleidoscope, can liberate many colleges and students from the narrowness of standardized tests and inspire new approaches to teaching for new kinds of talented, motivated citizens of the world.
EMI Troubleshooting Cookbook for Product Designers provides the 'recipe' for identifying why products fail to meet EMI/EMC regulatory standards. It also outlines techniques for tracking down the noise source and discovering the coupling mechanism that is causing the undesired effects.
Choosing a psychiatrist is complicated. If a person doesn’t know what to look for and the questions to ask, finding the right psychiatrist can be daunting. The goal is to find one who, while remaining a competent physician, is as comfortable and capable working with problems of the mind as he or she is prescribing psychiatric medications.
Combining over forty years of experience as a practicing psychiatrist with an insider’s perspective of current psychiatric practice, Dr. Robert Taylor provides invaluable guidance to persons considering psychiatric treatment or contemplating a change of doctor in an effort to find better treatment. Cautioning readers against settling for a psychiatrist who views psychodrugs as the treatment, Dr. Taylor provides specific suggestions for avoiding the growing number of psychiatrists who write scripts automatically.
In recent decades, psychiatric care has been overly reliant on psychodrugs. Patient diagnoses are being seriously questioned. Finding the Right Psychiatrist encourages people to seek care from a complete psychiatrist—one able and willing to pursue matters of mind and brain/body, rather than settling on psychodrugs as the main treatment.
Throughout the book, readers learn about the proper uses and limits of psychiatric diagnosis. Dr. Taylor carefully outlines an individualized approach to psychiatric care guided more by a patient’s particular problems and situation than by diagnoses that often mislead more than help. He provides a realistic appraisal of psychiatric medications and what they can and cannot do, as well as a discussion of mind work tools, traits of effective psychiatrists, suggestions for how to deal with common insurance company obstacles, and an explanation of the confusing politics of psychiatry.
An indispensable resource for anyone seeking psychiatric help or tasked with advising someone of what to look for in a doctor, Finding the Right Psychiatrist gives hope and guidance to those searching for complete and personalized care.
As the economies of China, India, and other Asian nations continue to grow, these countries are seeking greater control over the rules that govern international trade. Setting the rules carries with it the power to establish advantage, so it’s no surprise that everyone wants a seat at the table—or that negotiations over rules often result in stalemates at meetings of the World Trade Organization.
Nowhere is the conflict over rule setting more evident than in the simmering “standards wars” over the rules that define quality and enable the adjudication of disputes. In Global Rivalries, Amy A. Quark explores the questions of how rules are made, who makes them, and how they are enforced, using the lens of cotton—a simple commodity that has become a poignant symbol of both the crisis of Western rule making power and the potential for powerful new rivals to supplant it. Quark traces the strategies for influencing rule making processes employed not only by national governments but also by transnational corporations, fiber scientists, and trade associations from around the globe. Quark analyzes the efficacy of their approaches and the implications for more marginal actors in the cotton trade, including producers in West Africa.
By placing the current contest within the historical development of the global capitalist system, Global Rivalries highlights a fascinating interaction of politics and economics.
Few things make people react more strongly to the changes going on in health care than the word standardization. Critics shudder at the mindless sameness of standards, while supporters dream of a world in which standardized "best practices" open the way to efficient health care delivery. The Gold Standard takes up this debate to investigate the real meaning of standardization and how it affects patients, doctors, and the institution of medicine. Standards, Timmermans and Berg show, are not about more or less skill, or more or less uniformity, but about a redefinition of autonomy, patients, and relationships: they are about creating new worlds of medical treatment. Cutting through the hype and fears, the authors show where the true powers of standardization lie. The Gold Standard will become a classic for students of medicine and health care policy, and will be a welcome book for anyone concerned with the future of our system of care.
Performance accountability has been the dominant trend in education policy reform since the 1970s. State and federal policies set standards for what students should learn; require students to take “high-stakes” tests to measure what they have learned; and then hold students, schools, and school districts accountable for their performance. The goal of these policies is to push public school districts to ensure that all students reach a common threshold of knowledge and skills.
High-Stakes Reform analyzes the political processes and historical context that led to the enactment of state-level education accountability policies across the country. It also situates the education accountability movement in the broader context of public administration research, emphasizing the relationships among equity, accountability, and intergovernmental relations. The book then focuses on three in-depth case studies of policy development in Massachusetts, New Jersey, and Connecticut. Kathryn McDermott zeroes in on the most controversial and politically charged forms of state performance accountability sanctions, including graduation tests, direct state intervention in or closing of schools, and state takeovers of school districts.
Public debate casts performance accountability as either a cure for the problems of US public education or a destructive mistake. Kathryn McDermott expertly navigates both sides of the debate, detailing why particular policies became popular, how the assumptions behind the policies influenced the forms they took, and what practitioners and scholars can learn from the successes and failures of education accountability policies.
The United States has a long and unfortunate history of exposing employees, the public, and the environment to dangerous work. But in April 2009, the spotlight was on Las Vegas when the Pulitzer committee awarded its public service prize to the Las Vegas Sun for its coverage of the high fatalities on Las Vegas Strip construction sites. The newspaper attributed failures in safety policy to the recent “exponential growth in the Las Vegas market.” In fact, since Las Vegas’ founding in 1905, rapid development has always strained occupational health and safety standards.
A History of Occupational Health and Safety examines the work, hazards, and health and safety programs from the early building of the railroad through the construction of the Hoover Dam, chemical manufacturing during World War II, nuclear testing, and dense megaresort construction on the Las Vegas Strip. In doing so, this comprehensive chronicle reveals the long and unfortunate history of exposing workers, residents, tourists, and the environment to dangerous work—all while exposing the present and future to crises in the region. Complex interactions and beliefs among the actors involved are emphasized, as well as how the medical community interpreted and responded to the risks posed.
Because few places in the United States contain this mixture of industrial and postindustrial sites, the Las Vegas area offers unique opportunities to evaluate American occupational health during the twentieth century and reminds us all of the continuing relevance of protecting our workers.
Nations use product standards, and manipulate them, for reasons other than practical use or safety. The Soviets once cultivated standards to isolate themselves. In the United States, codes and standards are often used to favor home industries over external competition, and to favor some producers over others. Krislov compares and contrasts the United States, the EC, the former Eastern bloc, and Japan, to link standard choice with political styles and to trace growing internationalization based on product efficiency criteria.
Since Oliver’s guide was first published in 2010, thousands of LIS students, records managers, catalogers, and other library professionals have relied on its clear, plainspoken explanation of RDA: Resource Description and Access as their first step toward becoming acquainted with the cataloging standard. Now, reflecting the changes to RDA after the completion of the 3R Project, Oliver brings her Special Report up to date. This essential primer
concisely explains what RDA is, its basic features, and the main factors in its development;
describes RDA’s relationship to the international standards and models that continue to influence its evolution;
provides an overview of the latest developments, focusing on the impact of the 3R Project, the results of aligning RDA with IFLA’s Library Reference Model (LRM), and the outcomes of internationalization;
illustrates how information is organized in the post 3R Toolkit and explains how to navigate through this new structure; and
discusses how RDA continues to enable improved resource discovery both in traditional and new applications, including the linked data environment.
Offering the first broadly comparative analysis of place-based labeling and marketing systems, Knowing Where It Comes From examines the way claims about the origins and meanings of traditional foods get made around the world, from Italy and France to Costa Rica and Thailand. It also highlights the implications of different systems for both producers and consumers.
Labeling regimes have moved beyond intellectual property to embrace community-based protections, intangible cultural heritage, cultural landscapes, and indigenous knowledge. Reflecting a rich array of juridical, regulatory, and activist perspectives, these approaches seek to level the playing field on which food producers and consumers interact.
Generating over $12 billion in annual sales, kosher food is big business. It is also an unheralded story of successful private-sector regulation in an era of growing public concern over the government’s ability to ensure food safety. Kosher uncovers how independent certification agencies rescued American kosher supervision from fraud and corruption and turned it into a model of nongovernmental administration.
Currently, a network of over three hundred private certifiers ensures the kosher status of food for over twelve million Americans, of whom only eight percent are religious Jews. But the system was not always so reliable. At the turn of the twentieth century, kosher meat production in the United States was notorious for scandals involving price-fixing, racketeering, and even murder. Reform finally came with the rise of independent kosher certification agencies which established uniform industry standards, rigorous professional training, and institutional checks and balances to prevent mistakes and misconduct.
In overcoming many of the problems of insufficient resources and weak enforcement that hamper the government, private kosher certification holds important lessons for improving food regulation, Timothy Lytton argues. He views the popularity of kosher food as a response to a more general cultural anxiety about industrialization of the food supply. Like organic and locavore enthusiasts, a growing number of consumers see in rabbinic supervision a way to personalize today’s vastly complex, globalized system of food production.
A little-discussed aspect of the No Child Left Behind Act (NCLB) is a mandate that requires failing schools to hire after-school tutoring companies—the largest of which are private, for-profit corporations—and to pay them with federal funds. Making Failure Pay takes a hard look at the implications of this new blurring of the boundaries between government, schools, and commerce in New York City, the country’s largest school district.
As Jill P. Koyama explains in this revelatory book, NCLB—a federally legislated, state-regulated, district-administered, and school-applied policy—explicitly legitimizes giving private organizations significant roles in public education. Based on her three years of ethnographic fieldwork, Koyama finds that the results are political, problematic, and highly profitable. Bringing to light these unproven, unregulated private companies’ almost invisible partnership with the government, Making Failure Pay lays bare the unintended consequences of federal efforts to eliminate school failure—not the least of which is more failure.
Since it was first published, LIS students and professionals everywhere have relied on Miller’s authoritative manual for clear instruction on the real-world practice of metadata design and creation. Now the author has given his text a top-to-bottom overhaul to bring it fully up to date, making it even easier for readers to acquire the knowledge and skills they need, whether they use the book on the job or in a classroom. By following this book’s guidance, with its numerous practical examples that clarify common application issues and challenges, readers will
learn about the concept of metadata and its functions for digital collections, why it’s essential to approach metadata specifically as data for machine processing, and how metadata can work in the rapidly developing Linked Data environment;
know how to create high-quality resource descriptions using widely shared metadata standards, vocabularies, and elements commonly needed for digital collections;
become thoroughly familiarized with Dublin Core (DC) through exploration of DCMI Metadata Terms, CONTENTdm best practices, and DC as Linked Data;
discover what Linked Data is, how it is expressed in the Resource Description Framework (RDF), and how it works in relation to specific semantic models (typically called “ontologies”) such as BIBFRAME, composed of properties and classes with “domain” and “range” specifications;
get to know the MODS and VRA Core metadata schemes, along with recent developments related to their use in a Linked Data setting;
understand the nuts and bolts of designing and documenting a metadata scheme; and
gain knowledge of vital metadata interoperability and quality issues, including how to identify and clean inconsistent, missing, and messy metadata using innovative tools such as OpenRefine.
"Catherine Caufield has written an important book on an important topic:
the history behind the safety standards limiting the effects of high energy
radiation on human beings. . . . Provides an immense amount of information
in a very readable form."—W. Alan Runciman, Prometheus
"From fallout and radon to radioactive smoke detectors and dental X-rays,
Caufield traces the proliferation of the uses of radiation in medicine,
industry and the military, and in generating energy. An intelligent,
non-alarmist history."—Publishers Weekly
Achievement tests play an important role in modern societies. They are used to evaluate schools, to assign students to tracks within schools, and to identify weaknesses in student knowledge. The GED is an achievement test used to grant the status of high school graduate to anyone who passes it. GED recipients currently account for 12 percent of all high school credentials issued each year in the United States. But do achievement tests predict success in life?
The Myth of Achievement Tests shows that achievement tests like the GED fail to measure important life skills. James J. Heckman, John Eric Humphries, Tim Kautz, and a group of scholars offer an in-depth exploration of how the GED came to be used throughout the United States and why our reliance on it is dangerous. Drawing on decades of research, the authors show that, while GED recipients score as well on achievement tests as high school graduates who do not enroll in college, high school graduates vastly outperform GED recipients in terms of their earnings, employment opportunities, educational attainment, and health. The authors show that the differences in success between GED recipients and high school graduates are driven by character skills. Achievement tests like the GED do not adequately capture character skills like conscientiousness, perseverance, sociability, and curiosity. These skills are important in predicting a variety of life outcomes. They can be measured, and they can be taught.
Using the GED as a case study, the authors explore what achievement tests miss and show the dangers of an educational system based on them. They call for a return to an emphasis on character in our schools, our systems of accountability, and our national dialogue.
Eric Grodsky, University of Wisconsin–Madison
Andrew Halpern-Manners, Indiana University Bloomington
Paul A. LaFontaine, Federal Communications Commission
Janice H. Laurence, Temple University
Lois M. Quinn, University of Wisconsin–Milwaukee
Pedro L. Rodríguez, Institute of Advanced Studies in Administration
John Robert Warren, University of Minnesota, Twin Cities
“A powerful, detailed, and exceptionally balanced critique of NCLB. It offers some hope for how we might overcome its faults. No legislator or educational expert should be allowed to get away with not reading it—whether to agree or disagree. It’s a must learning experience.”
—Deborah Meier, Senior Scholar and Adjunct Professor, Steinhardt School of Education, New York University, and author of In Schools We Trust
“A concise, highly readable, and balanced account of NCLB, with insightful and realistic suggestions for reform. Teachers, professors, policymakers, and parents—this is the one book about NCLB you ought to read.”
—James E. Ryan, William L. Matheson and Robert M. Morgenthau Distinguished Professor, University of Virginia School of Law
This far-reaching new study looks at the successes and failures of one of the most ambitious and controversial educational initiatives since desegregation—the No Child Left Behind Act of 2001.
NCLB’s opponents criticize it as underfunded and unworkable, while supporters see it as a radical but necessary educational reform that evens the score between advantaged and disadvantaged students. Yet the most basic and important question remains unasked: “Can we ever really know if a child’s education is good?”
Ultimately, Scott Franklin Abernathy argues, policymakers must begin from this question, rather than assuming that any test can accurately measure the elusive thing we call “good” education.
American schools have always been locally created and controlled. But ever since the Title I program in 1965 appropriated nearly one billion dollars for public schools, federal money and programs have been influencing every school in America. What has been accomplished in this extraordinary assertion of federal influence? What hasn’t? Why not? With incisive clarity and wit, David K. Cohen and Susan L. Moffitt argue that enormous gaps existed between policies and programs and the real-world practices that they attempted to change. Learning and teaching are complicated and mysterious. So the means to achieve admirable goals are uncertain, and difficult to develop and sustain, particularly when teachers get little help to cope with the blizzard of new programs, new slogans, new tests, and new rules. Ironically, as the authors observe, the least experienced and least well-trained teachers are often in the most needy schools, so federal support “is compromised by the inequality it is intended to ameliorate.” If new policies and programs don’t include means to create the capability they require, they cannot succeed. We don’t know what we need to enable states, school systems, schools, teachers, and students to use the resources that programs offer. The trouble with standards-based reform is that standards and tests still don’t teach you how to teach.
Engaging education policy from kindergarten to college
Author Tyler S. Branson argues that education reform initiatives in the twentieth century can be understood in terms of historical shifts in the ideas, interests, and governing arrangements that inform the teaching of writing. Today, policy regimes of “accountability” shape education reform programs such as Common Core in K-12 and Dual Enrollment in postsecondary institutions. This book reopens the conversation between policy makers and writing teachers, empirically describing the field’s institutional/historical relationship to policy and the ways teachers work on a daily basis to carry out policy. Federal and state accountability policy significantly shapes classrooms before teachers even enter them, but Branson argues the classroom is where teachers leverage disciplinary knowledge about writing to bridge, partner with, support, and sometimes resist education policies.
Branson deftly blends policy critique, archival analysis, and participant observation to offer the first scholarly treatment of the National Council of Teachers of English (NCTE) Washington Task Force as well as a rare empirical study of a dual enrollment course offered in a high school. This book’s macro- and micro-level analysis of education policy reveals how writing teachers, researchers, and administrators can strengthen their commitments to successfully teaching their students across all levels of education, while deepening their understanding of the ways education policy helps—and hinders—those commitments.
In a story of reform and backlash, Lorraine McDonnell reveals the power and the dangers of policies based on appeals to voters' values. Exploring the political struggles inspired by mass educational tests, she analyzes the design and implementation of statewide testing in California, Kentucky, and North Carolina in the 1990s.
Educational reformers and political elites sought to use test results to influence teachers, students, and the public by appealing to their values about what schools should teach and offering apparently objective evidence about whether the schools were succeeding. But mass testing mobilized parents who opposed and mistrusted the use of tests, and left educators trying to mediate between angry citizens and policies the educators may not have fully supported. In the end, some testing programs were significantly altered. Yet despite the risks inherent in relying on values to change what students are taught, these tests and the educational ideologies behind them have modified classroom practice.
McDonnell draws lessons from these stories for the federal No Child Left Behind Act, with its sweeping directives for high-stakes testing. To read this book is to witness the unfolding drama of America's educational culture wars, and to see hope for their resolution.
IEC 1131-3 is the international standard for the design of software for industrial control systems. It defines a set of related graphical and textual languages that bring significant benefits throughout the control system life-cycle - benefits for system integrators, control system engineers, and end-users alike. The standard marks the beginning of well-structured, re-usable, and maintainable software for industrial control systems.
Putting Descriptive Standards to Work, edited by Kris Kiesling and Christopher J. Prom, is the most recent addition to SAA’s Trends in Archives Practice series. The book consists of four modules: Module 17: Implementing DACS: A Guide to the Archival Content Standard, by Cory L. Nimer, leads archivists through the provisions of Describing Archives: A Content Standard (DACS); Module 18: Using EAD3, by Kelcy Shepherd, introduces version 3 of Encoded Archival Description (EAD3); Module 19: Introducing EAC-CPF, by Katherine M. Wisser, introduces Encoded Archival Context–Corporate Bodies, Persons, and Families (EAC-CPF); and Module 20: Sharing Archival Metadata, by Aaron Rubinstein, explores strategies for sharing archival metadata with researchers in the digital humanities and other archivists.
What happens to federal and state policies as they move from legislative chambers to individual districts, schools, and, ultimately, classrooms? Although policy implementation is generally seen as an administrative problem, James Spillane reminds us that it is also a psychological problem.
After intensively studying several school districts' responses to new statewide science and math teaching policies in the early 1990s, Spillane argues that administrators and teachers are inclined to assimilate new policies into current practices. As new programs are communicated through administrative levels, the understanding of them becomes increasingly distorted, no matter how sincerely the new ideas are endorsed. Such patterns of well-intentioned misunderstanding highlight the need for systematic training and continuing support for the local administrators and teachers who are entrusted with carrying out large-scale educational change, classroom by classroom.
The third book in the Museum’s Occasional Contributions series is Benjamin March’s effort to standardize descriptions of pottery. Standards were needed so that pottery collections at various museums or institutions could be compared. With an introduction by Carl E. Guthe.
In Standards of Value, Michael Germana reveals how tectonic shifts in U.S. monetary policy—from the Coinage Act of 1834 to the abolition of the domestic gold standard in 1933–34—correspond to strategic changes by American writers who renegotiated the value of racial difference. Populating the pages of this bold and innovative study are authors as varied as Harriet Beecher Stowe, George Washington Cable, Charles Chesnutt, James Weldon Johnson, Nella Larsen, Jessie Redmon Fauset, and Ralph Ellison—all of whom drew analogies between the form Americans thought the nation's money should take and the form they thought race relations and the nation should take.
A cultural history of race organized around and enmeshed within the theories of literary and monetary value, Standards of Value also recovers a rhetorical tradition in American culture whose echoes can be found in the visual and lyrical grammars of hip hop, the paintings of John W. Jones and Michael Ray Charles, the cinematography of Spike Lee, and many other contemporary forms and texts.
This reconsideration of American literature and cultural history has implications for how we value literary texts and how we read shifting standards of value. In vivid prose, Germana explains why dollars and cents appear where black and white bodies meet in American novels, how U.S. monetary policy gave these symbols their cultural currency, and why it matters for scholars of literary and cultural studies.
The topic of streets and street design is of compelling interest today as public officials, developers, and community activists seek to reshape urban patterns to achieve more sustainable forms of growth and development. Streets and the Shaping of Towns and Cities traces ideas about street design and layout back to the early industrial era in London suburbs and then on through their institutionalization in housing and transportation planning in the United States. It critiques the situation we are in and suggests some ways out that are less rigidly controlled, more flexible, and responsive to local conditions.
Originally published in 1997, this edition includes a new introduction that addresses topics of current interest including revised standards from the Institute of Transportation Engineers; changes in city plans and development standards following New Urbanist, Smart Growth, and sustainability principles; traffic calming; and ecologically oriented street design.
This book provides the reader with a clear understanding of how both BS 7671 and BS 7909, together with other relevant industry guidance, apply to events. It is an indispensable guide for all those working with temporary power systems, including those used at agricultural shows and outdoor fairs, concerts, theatrical events, film and TV broadcasting, exhibitions, and festivals, as well as in temporary buildings and structures.
A standard track gauge—the distance between the two rails—enables connecting railway lines to exchange traffic. But despite the benefits of standardization, early North American railways used six different gauges extensively, and even today breaks of gauge at national borders and within such countries as India and Australia are expensive burdens on commerce. In Tracks across Continents, Paths through History, Douglas J. Puffert offers a global history of railway track gauge, examining early choices and the dynamic process of diversity and standardization that resulted.
Drawing on the economic theory of path dependence, and grounded in economic, technical, and institutional realities, this innovative volume traces how early historical events, and even idiosyncratic personalities, have affected choices of gauge ever since, despite changing technology and understandings of what gauge is optimal. Puffert also uses this history to develop new insights in the theory of path dependence. Tracks across Continents, Paths through History will be essential reading for anyone interested in how history and economics inform each other.
Tuning the World tells the unknown story of how the musical pitch A 440 became the global norm.
Now commonly accepted as the point of reference for musicians in the Western world, A 440 hertz only became the standard pitch during an international conference held in 1939. The adoption of this norm was the result of decades of negotiations between countries, involving a diverse group of performers, composers, diplomats, physicists, and sound engineers. Although there is widespread awareness of the variability of musical pitches over time, as attested by the use of lower frequencies to perform early music repertoires, no study has fully explained the invention of our current concert pitch. In this book, Fanny Gribenski draws on a rich variety of previously unexplored archival sources and a unique combination of musicological perspectives, transnational history, and science studies to trace how this standard was made. Tuning the World demonstrates the aesthetic, scientific, industrial, and political contingencies underlying the construction of one of the most “natural” objects of contemporary musical performance and shows how this century-old effort was ultimately determined by the influence of a few powerful nations.