Computer Science Principles

2.2 Data Compression

Enduring Understanding

The way a computer represents data internally is different from the way the data are interpreted and displayed for the user. Programs are used to translate data into a representation more easily understood by people.

Learning Objective

Compare data compression algorithms to determine which is best in a particular context. 

Essential Knowledge

Data compression can reduce the size (number of bits) of transmitted or stored data.

Fewer bits do not necessarily mean less information.

The amount of size reduction from compression depends on both the amount of redundancy in the original data representation and the compression algorithm applied.

Lossless data compression algorithms can usually reduce the number of bits stored or transmitted while guaranteeing complete reconstruction of the original data.
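
To make the lossless idea concrete, here is a minimal run-length encoding (RLE) sketch in Python. RLE is just one simple lossless scheme chosen for illustration; the function names and sample strings are invented for this example and are not part of the course materials. Notice that the highly repetitive string shrinks while the varied one does not, and that decoding reproduces the original exactly.

```python
# Minimal run-length encoding (RLE) sketch -- one simple lossless scheme.
# Function names and sample strings are illustrative, not from the course.

def rle_encode(text):
    """Collapse runs of repeated characters into (character, count) pairs."""
    encoded = []
    i = 0
    while i < len(text):
        run = 1
        while i + run < len(text) and text[i + run] == text[i]:
            run += 1
        encoded.append((text[i], run))
        i += run
    return encoded

def rle_decode(pairs):
    """Rebuild the original text exactly from the (character, count) pairs."""
    return "".join(ch * count for ch, count in pairs)

redundant = "AAAAAABBBBCC"   # lots of repetition -> compresses well
varied    = "ABCDEFGHIJKL"   # no repetition -> RLE gains nothing

for sample in (redundant, varied):
    pairs = rle_encode(sample)
    assert rle_decode(pairs) == sample   # lossless: exact reconstruction
    print(sample, "->", pairs, f"({len(sample)} chars vs {len(pairs)} pairs)")
```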

Lossy data compression algorithms can significantly reduce the number of bits stored or transmitted but only allow reconstruction of an approximation of the original data.

Lossy data compression algorithms can usually reduce the number of bits stored or transmitted more than lossless compression algorithms.
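
As a rough sketch of the lossy idea, the snippet below assumes 8-bit samples (for example, grayscale pixel or audio values) and keeps only the top four bits of each one. The sample values and function names are made up for illustration: storage drops by half, but reconstruction yields only an approximation of the original.

```python
# Rough lossy-compression sketch: quantize 8-bit samples (0-255) down to
# 4 bits by keeping only the high-order bits. Sample values and function
# names are illustrative, not from the course materials.

def quantize(samples, bits=4):
    """Keep only the top `bits` bits of each 8-bit sample."""
    shift = 8 - bits
    return [s >> shift for s in samples]

def reconstruct(codes, bits=4):
    """Expand the reduced codes back to the 8-bit range (approximately)."""
    shift = 8 - bits
    return [c << shift for c in codes]

original = [12, 13, 200, 201, 90, 91, 255]   # 8 bits per sample
codes = quantize(original)                    # 4 bits per sample: half the size
approx = reconstruct(codes)

print("original:", original)
print("approx:  ", approx)   # close to, but not equal to, the original
print("max error:", max(abs(a - b) for a, b in zip(original, approx)))
```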

In situations where quality or ability to reconstruct the original is maximally important, lossless compression algorithms are typically chosen.

In situations where minimizing data size or transmission time is maximally important, lossy compression algorithms are typically chosen.
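
For a quick comparison, Python's standard-library zlib module (a real lossless codec) shows why this trade-off arises: highly redundant data shrinks dramatically, while data with little redundancy barely shrinks and may even grow slightly, which is exactly the situation where a lossy algorithm, if an approximation is acceptable, is chosen to minimize size or transmission time. The data below is generated on the spot purely for illustration.

```python
# Compare how well a lossless codec (zlib) does on redundant vs dense data.
import os
import zlib

redundant = b"AB" * 500      # 1000 bytes of pure repetition
dense = os.urandom(1000)     # 1000 bytes with essentially no redundancy

for label, data in (("redundant", redundant), ("dense", dense)):
    packed = zlib.compress(data)
    assert zlib.decompress(packed) == data   # lossless round trip
    print(f"{label}: {len(data)} -> {len(packed)} bytes")
```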
