Making Software: What Really Works, and Why We Believe It

Many claims are made about how certain tools, technologies, and practices improve software development. But which claims are verifiable, and which are merely wishful thinking? In this book, leading thinkers such as Steve McConnell, Barry Boehm, and Barbara Kitchenham offer essays that uncover the t...


Bibliographic Details
Other Authors: Oram, Andrew (author, editor); Wilson, Greg (editor)
Format: eBook
Language: English
Published: Sebastopol : O'Reilly [2011]
Edition: 1st ed.
Series: Theory in practice
See on Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009628326406719
Table of Contents:
  • Preface: Organization of This Book; Conventions Used in This Book; Safari® Books Online; Using Code Examples; How to Contact Us
  • Part I. General Principles of Searching For and Using Evidence
  • Chapter 1. The Quest for Convincing Evidence: In the Beginning; The State of Evidence Today; Challenges to the Elegance of Studies; Challenges to Statistical Strength; Challenges to Replicability of Results; Change We Can Believe In; The Effect of Context; Looking Toward the Future; References
  • Chapter 2. Credibility, or Why Should I Insist on Being Convinced?: How Evidence Turns Up in Software Engineering; Credibility and Relevance; Fitness for Purpose, or Why What Convinces You Might Not Convince Me; Quantitative Versus Qualitative Evidence: A False Dichotomy; Aggregating Evidence; Limitations and Bias; Types of Evidence and Their Strengths and Weaknesses; Controlled Experiments and Quasi-Experiments (Credibility; Relevance); Surveys (Credibility; Relevance); Experience Reports and Case Studies (Credibility; Relevance); Other Methods; Indications of Credibility (or Lack Thereof) in Reporting (General characteristics; A clear research question; An informative description of the study setup; A meaningful and graspable data presentation; A transparent statistical analysis (if any); An honest discussion of limitations; Conclusions that are solid yet relevant); Society, Culture, Software Engineering, and You; Acknowledgments; References
  • Chapter 3. What We Can Learn from Systematic Reviews: An Overview of Systematic Reviews; The Strengths and Weaknesses of Systematic Reviews; The Systematic Review Process (Planning the review; Conducting the review; Reporting the review); Problems Associated with Conducting a Review; Systematic Reviews in Software Engineering; Cost Estimation Studies (The accuracy of cost estimation models; The accuracy of cost estimates in industry); Agile Methods (Dybå and Dingsøyr; Hannay, Dybå, Arisholm, and Sjøberg); Inspection Methods; Conclusion; References
  • Chapter 4. Understanding Software Engineering Through Qualitative Methods: What Are Qualitative Methods?; Reading Qualitative Research; Using Qualitative Methods in Practice; Generalizing from Qualitative Results; Qualitative Methods Are Systematic; References
  • Chapter 5. Learning Through Application: The Maturing of the QIP in the SEL: What Makes Software Engineering Uniquely Hard to Research; A Realistic Approach to Empirical Research; The NASA Software Engineering Laboratory: A Vibrant Testbed for Empirical Research; The Quality Improvement Paradigm (Characterize; Set Goals; Select Process; Execute Process; Analyze; Package); Conclusion; References
  • Chapter 6. Personality, Intelligence, and Expertise: Impacts on Software Development: How to Recognize Good Programmers; Individual Differences: Fixed or Malleable; Personality; Intelligence; The Task of Programming