Algorithms and data structures : the science of computing / by Douglas Baldwin and Greg W. Scragg

By: Baldwin, Douglas
Contributor(s): Scragg, Greg W.
Material type: Text
Publication details: New Delhi : Dreamtech Charles River Media, c2004
Description: xx, 620 p. : ill. ; 24 cm
ISBN: 8177225456 (pbk.)
Other title: Algorithms & data structures
Subject(s): Computer algorithms
DDC classification: 005.1
Contents:
Part I: The Science of Computing's Three Methods of Inquiry
Chapter 1: What is the Science of Computing? 1.1 Algorithms and the Science of Computing 1.1.1 Problems 1.1.2 Algorithms 1.1.3 Describing Algorithms 1.2 Computer Science's Methods of Inquiry 1.2.1 Design 1.2.2 Theory 1.2.3 Empirical Analysis 1.3 Concluding Remarks 1.4 Further Reading
Chapter 2: Abstraction: An Introduction to Design 2.1 Abstraction with Objects 2.1.1 Objects 2.1.2 Messages 2.1.3 Classes 2.1.4 Objects as Abstractions 2.2 Preconditions and Postconditions 2.3 Algorithms that Produce Effects 2.3.1 Design from Pre- and Postconditions 2.3.2 Subclasses 2.3.3 Method Abstraction 2.4 Algorithms that Produce Values 2.4.1 Expressions 2.4.2 Abstractions of Expressions 2.4.3 Values and Side-Effects 2.5 Encapsulating Values 2.6 Concluding Remarks 2.7 Further Reading
Chapter 3: Proof: An Introduction to Theory 3.1 Basic Ideas 3.1.1 Correctness 3.1.2 Proof 3.1.3 Proving an Algorithm Correct 3.1.4 Proof Methods 3.2 Correctness of Methods 3.2.1 An Example 3.2.2 Proof Structure 3.3 Correctness of Side-Effects 3.3.1 The Square-Drawing Problem Revisited 3.3.2 drawLine 3.3.3 drawSquare 3.4 Correctness of Classes 3.4.1 Class Invariants 3.4.2 Object Initialization and Constructors 3.4.3 Proving Class Invariants 3.5 Applying the Proof Techniques 3.5.1 The Correctness of MakeChange 3.5.2 Proofs and Incorrect Algorithms 3.6 Analyzing Efficiency 3.6.1 An Example 3.6.2 Efficiency and Problem Size 3.6.3 Execution Time 3.6.4 Resources 3.7 Concluding Remarks 3.8 Further Reading
Chapter 4: Experimentation: An Introduction to the Scientific Method 4.1 Theories and Hypotheses 4.1.1 The Role of Experiments 4.2 Predictions 4.3 Experimental Procedure 4.3.1 Define Variables 4.3.2 Use Appropriate Instruments 4.3.3 Thoroughly Explore the Independent Variable 4.3.4 Control Differences between Subjects 4.3.5 Make Multiple Measurements 4.4 Experimental Error 4.4.1 Systematic Error 4.4.2 Random Error 4.5 Data Analysis 4.5.1 Estimating True Values 4.5.2 The Error in an Average 4.5.3 Relative Error 4.6 Forming Conclusions 4.7 Measuring Time in Computer Science Experiments 4.7.1 Defining Time as a Variable 4.7.2 Measuring Time 4.7.3 Error Sources in Time Measurements 4.8 An Example Experiment 4.9 Other Uses of Experiments in Computer Science 4.9.1 Programming 4.9.2 Hardware/Software Systems 4.9.3 Human Considerations 4.10 Concluding Remarks 4.11 Further Reading
Part II: Program Design
Chapter 5: Conditionals 5.1 Review of Terminology 5.2 The Test: Boolean Expressions 5.2.1 Conjunction (And) 5.2.2 Disjunction (Or) 5.2.3 Negation (Not) 5.2.4 Implication 5.2.5 Boolean Expressions and Java 5.3 Boolean Algebra 5.3.1 Complex Truth Tables 5.3.2 Truth Tables as Proof Technique 5.3.3 Manipulating Boolean Expressions 5.4 Correctness: Proof by Case 5.4.1 Separation of Cases 5.5 Performance Analysis and Conditionals 5.5.1 Expanding the Definition of Execution Time 5.5.2 Bounding Execution Time 5.5.3 What the Best Case and Worst Case Can Tell Us 5.6 Boolean Algebra as a Program Design Tool 5.6.1 Robust Algorithms: The Power of Conditionals 5.6.2 Construction of Boolean Methods 5.6.3 Interpreting Nested Conditionals 5.6.4 Finding Design Flaws 5.6.5 Establishing Test Data 5.7 Empirical Measure and Conditional Statements 5.8 Concluding Remarks 5.9 Further Reading
Chapter 6: Designing with Recursion 6.1 Basics of Recursive Thinking 6.1.1 Stating the Problem Recursively 6.1.2 Restating in Computational Terms 6.1.3 A Recursive Java Method 6.1.4 Recursion is Ubiquitous 6.2 Controlling Recursion: Indices 6.2.1 Simple Indices 6.2.2 General Principles for Indices 6.2.3 Non-Numeric Indices 6.3 Value Producing Recursion 6.3.1 Inductive Definitions in Mathematics 6.3.2 Converting a Mathematical Definition to a Method 6.3.3 The Index as Result 6.4 Assuring Termination 6.5 Recursive Descriptions as Algorithms 6.5.1 Understanding the Execution of Recursive Algorithms 6.5.2 Classifying R. 6.5.3 Algorithms with Multiple Recursive References 6.5.4 Recursive Call is Never First 6.5.5 Hiding Recursion 6.6 Concluding Remarks 6.7 Further Reading
Chapter 7: Analysis of Recursion 7.1 Correctness 7.1.1 Induction in Mathematics 7.1.2 Induction and the Correctness of Recursive Algorithms 7.2 Efficiency 7.2.1 Recurrence Relations in Mathematics 7.2.2 Recurrence Relations and Recursive Algorithms 7.3 Concluding Remarks 7.4 Further Reading
Chapter 8: Creating Correct Iterative Algorithms 8.1 Loop Invariants 8.1.1 Formal Definition 8.1.2 Design with Loop Invariants 8.2 Correctness of Iterative Algorithms 8.2.1 Proving Termination 8.2.2 Proving Loop Invariants 8.2.3 Proving Postconditions 8.2.4 A Complete Correctness Proof 8.3 Concluding Remarks 8.4 Further Reading
Chapter 9: Iteration and Efficiency 9.1 Counting the Steps in Iterative Algorithms 9.1.1 Overview 9.1.2 Simple Loops 9.1.3 Varying Number of Steps per Iteration 9.1.4 Deriving the Number of Iterations 9.2 Relating Step Counts to Execution Time 9.2.1 Introductory Examples 9.2.2 Asymptotic Notation 9.2.3 Testing Asymptotic Hypotheses 9.3 Iteration and Measuring Short Execution Times 9.3.1 The Basic Technique 9.3.2 Controlling for Overhead 9.3.3 Error 9.4 Concluding Remarks 9.5 Further Reading
Chapter 10: A Case Study in Design and Analysis: Efficient Sorting 10.1 An Initial Insight 10.1.1 The Insight 10.1.2 Example 10.1.3 Strategy: Divide and Conquer 10.2 Quicksort Design 10.2.1 Refining Quicksort's Design 10.2.2 Partitioning 10.3 Quicksort Performance 10.3.1 The Number of Comparisons in Partition 10.3.2 Accounting for Quicksort's Recursion 10.3.3 Quicksort's Best-Case Performance 10.3.4 Quicksort's Worst-Case Performance 10.4 Mergesort 10.4.1 Overall Design 10.4.2 The Merge Algorithm 10.4.3 Performance 10.5 Concluding Remarks 10.6 Further Reading
Part III: Introduction to Data Structures
Chapter 11: Lists 11.1 Examples of Lists or Sequences 11.1.1 Real World Lists 11.1.2 Of Special. 11.2 Developing the Concept of a List 11.2.1 Manipulating Lists 11.2.2 Visiting the Subparts 11.2.3 A Unifying Observation 11.2.4 The Empty List 11.3 A Formal Definition of the Class List 11.3.1 Basic Methods 11.3.2 Examples 11.4 Implementing the Class 11.4.1 Constructing a Basic List Class 11.5 Making an Extendable List Class 11.5.1 Creating New Methods for Lists 11.5.2 The Final List Class 11.5.3 Ordered List as an Extension of List 11.5.4 Complex Elements 11.5.5 Case Study: Insertion Sort 11.6 Variants on the Concept of List 11.6.1 Alternate Representations of the Base Class 11.6.2 Two Way Lists 11.7 Concluding Remarks 11.8 Further Reading
Chapter 12: Queues and Stacks 12.1 Queues 12.1.1 Motivation from the Real World 12.1.2 Computer Science Examples 12.1.3 The Queue Interface 12.1.4 Implementation and Efficiency 12.1.5 A Workable Definition 12.1.6 Implementing the Queue Class 12.2 Stacks 12.2.1 Real World Motivations for Stacks 12.2.2 Stacks and Computer Science 12.2.3 Designing the Stack Class 12.2.4 Implementing the Class Stack 12.2.5 Stacks and Recursion 12.2.6 Implications of Stacks for Common Computing Problems 12.3 Concluding Remarks 12.3.1 Queues vs. Stacks 12.3.2 Looking Ahead to Other Structures 12.4 Further Reading
Chapter 13: Binary Trees 13.1 Hierarchical Organization 13.1.1 Real World Trees 13.1.2 Trees as Analogies 13.1.3 Trees and Computer Science 13.2 Formalizing the Concept of Tree 13.2.1 Trees as Data Structures 13.2.2 Visualizing a Formalism 13.3 Ordered Trees 13.3.1 Definition of an Ordered Tree 13.3.2 Traversal 13.3.3 Constructing an Ordered Tree 13.3.4 Searching an Ordered Tree 13.3.5 Performance Analysis for Ordered Trees 13.3.6 Trees that Are Not Fully Balanced 13.3.7 Empirical Evaluation of Construction and Search Time 13.4 An OrderedTree Class for Java 13.5 Restructuring Ordered Trees 13.5.1 Deleting Nodes from Trees 13.5.2 Balancing a Tree 13.6 Arithmetic Expressions as Trees 13.7 Other Implementations 13.7.1 Null Empty Trees 13.7.2 n-ary Trees 13.8 Concluding Remarks 13.9 Further Reading
Chapter 14: Case Studies in Design: Abstracting Indirection 14.1 Indirection Without Formal Pointers 14.1.1 Review of Indirection in Java 14.1.2 Abstracting the Concept of Indirection 14.2 Using Arrays to Hold Dynamic Data Structures 14.2.1 Arrays as Lists 14.2.2 Circle Queue 14.2.3 Stacks 14.2.4 Array-Based Trees 14.3 Hash Coding 14.3.1 Content-Based Pointers 14.3.2 Collisions 14.3.3 Uneven Distribution 14.3.4 Dealing with Collisions 14.3.5 Hashing and Empirical Results 14.4 Priority Queues 14.4.1 The Public Interface 14.4.2 What We Already Know Might Help Us 14.4.3 Some Nice Ideas and What's Wrong with Them 14.4.4 The Heap Approach 14.4.5 One Final Detail: Finding the Oldest 14.5 Concluding Remarks 14.6 Further Reading
Part IV: The Limits of Computer Science
Chapter 15: Exponential Growth 15.1 Warm-up: The Towers of Hanoi 15.1.1 The Problem 15.1.2 The Algorithm 15.1.3 Execution Time 15.2 Drawing Trees 15.2.1 The Algorithm 15.2.2 The Execution Time of the Tree-Drawing Algorithm 15.2.3 A Variation on the Algorithm 15.3 The Empirical Significance of Exponential Time 15.4 Reducing Exponential Costs 15.4.1 The Fibonacci Numbers 15.4.2 An Obvious Fibonacci Number Algorithm 15.4.3 The Dynamic Programming Algorithm 15.5 Concluding Remarks 15.6 Further Reading
Chapter 16: Limits to Performance 16.1 Preliminary Remarks 16.2 Finding Lower Bounds and Proving Optimality 16.2.1 The Towers of Hanoi 16.2.2 Other Problems 16.3 Open Questions 16.3.1 Tractable and Intractable Problems 16.3.2 Factoring and Cryptography 16.3.3 The Travelling Sales Representative and NP-Hard Problems 16.4 Concluding Remarks 16.5 Further Reading
Chapter 17: The Halting Problem 17.1 Applying Computer Science to Computer Science 17.1.1.
Holdings:
Item type: Reference
Current library: Namal Library (Reference)
Call number: 005.1 BAL-A 2004 1285
Status: Not for loan
Barcode: 1285
Total holds: 0

Index included

