Topic 2: Data and Data Representation


1. Data, Information and Processing

1.1. Data Formats

1.1.1. Computers process and store all forms of data in binary format

1.1.2. Human communication includes language, images and sounds

1.1.3. Data formats are specifications for converting data into computer-usable form; they define the different ways human data may be represented, stored and processed by a computer

1.2. Information processing cycle

1.2.1. is the series of input, process, output, and storage activities

1.2.2. Collects data (input) > Processing > Produces information (output)
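As a minimal sketch (with hypothetical function names), the input > process > output cycle can be expressed in Python:

```python
def collect_input(raw):
    # Input: collect raw data (here, a string of comma-separated numbers)
    return [int(x) for x in raw.split(",")]

def process(data):
    # Process: transform the raw data into something meaningful
    return sum(data) / len(data)

def produce_output(result):
    # Output: present the resulting information to the user
    return f"Average: {result:.1f}"

data = collect_input("4,8,15,16")   # input
info = process(data)                # processing
print(produce_output(info))         # output
```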

1.3. Data, Information, Knowledge

1.3.1. Data > Information > Knowledge

1.3.2. Data: unprocessed facts and figures; a collection of raw items that can include text, numbers, images, audio, and video

1.3.3. Information: data that has been processed and interpreted so that it conveys meaning and is useful to people

1.3.4. Knowledge: information combined with experience and insight

1.4. Processing – Data Coding

1.4.1. Data is encoded by assigning a bit pattern to each character, digit, or multimedia object.

1.4.2. Many standards exist for encoding: character encodings like ASCII, image encodings like JPEG, and video encodings like MPEG-4

1.4.3. Examples of standards: alphanumeric (ASCII, EBCDIC, Unicode); image (JPEG, GIF, PCX, TIFF); motion picture (MPEG-2, QuickTime); sound (Sound Blaster, WAV, AU); outline graphics/fonts (PostScript, TrueType, PDF)

1.4.4. Processing – Data Storage and Compression: compression reduces the size of data to save space or transmission time; the two categories of data compression are lossless and lossy
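As an illustration of lossless compression, Python's standard zlib module can shrink repetitive data and restore it exactly (lossy formats like JPEG instead discard detail that cannot be recovered):

```python
import zlib

# Highly repetitive data compresses well
original = b"AAAAABBBBBCCCCC" * 100

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), "->", len(compressed))  # far fewer bytes after compression
assert restored == original  # lossless: byte-for-byte identical
```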

1.5. Processing – Data Integrity

1.5.1. Security or protection of data

1.5.2. Involves controlling access to files, e.g. via Access Control Lists (ACLs)

1.5.3. Protect files from being read, written to, or executed, e.g. with password protection or keyboard locking

1.5.4. Data integrity = quality of data: correctness, completeness, validity, compliance, consistency
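One common integrity check (an illustrative sketch, not the only mechanism) is storing a cryptographic hash alongside the data and recomputing it later; a mismatch reveals corruption or tampering:

```python
import hashlib

record = b"account=1234;balance=500"
digest = hashlib.sha256(record).hexdigest()  # stored with or beside the data

# Later: recompute the hash and compare it to the stored digest
tampered = b"account=1234;balance=900"
print(hashlib.sha256(record).hexdigest() == digest)    # True: data intact
print(hashlib.sha256(tampered).hexdigest() == digest)  # False: data changed
```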

2. Bits, Bytes, and Words

2.1. Bits

2.1.1. The basic unit of information in computing and telecommunication

2.1.2. A bit can take one of two values, usually denoted 0 and 1 and interpreted as binary digits

2.2. Bytes

2.2.1. a unit of digital information in computing and telecommunications

2.2.2. The number of bits used to encode a single character of text in a computer; in modern systems, one byte is eight bits

2.3. Words

2.3.1. a term for the natural unit of data used by a particular computer design

2.3.2. fixed sized group of bits that are handled together by the system

2.3.3. an important characteristic of computer architecture.
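These units can be inspected directly in Python (the printed word size assumes a typical 64-bit system):

```python
import sys

# A byte is 8 bits, so it holds 2**8 = 256 distinct values (0-255)
assert 2 ** 8 == 256

# Python's bytes type stores one byte per element
b = bytes([65, 66, 67])
print(b, len(b))  # three bytes, one ASCII character each

# The platform's native word size (64 on a typical 64-bit system)
print(sys.maxsize.bit_length() + 1)
```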

2.4. When referring to binary, octal, decimal, or hexadecimal numbers, a single lowercase letter is appended to the end of each number to identify its base.

2.5. Numbering System

2.5.1. Radix examples: hexadecimal 45 is written as 45h; octal 76 is written as 76o or 76q; binary 11010011 is written as 11010011b

2.5.2. Base: the number of different symbols available to represent a number, e.g. base 2 (binary), base 8 (octal), base 10 (decimal), base 16 (hexadecimal). For a given number, the larger the base, the more symbols the symbol set requires, but the fewer digits are needed to write the number
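Python's built-in conversions make this trade-off concrete; the same value needs fewer digits as the base grows:

```python
n = 211  # the value written 11010011b in the radix examples above

print(bin(n))  # 0b11010011 - base 2,  8 digits
print(oct(n))  # 0o323      - base 8,  3 digits
print(n)       # 211        - base 10, 3 digits
print(hex(n))  # 0xd3       - base 16, 2 digits

# int() parses a string in any base from 2 to 36
assert int("11010011", 2) == int("323", 8) == int("d3", 16) == 211
```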

2.6. Binary System

2.6.1. Early computer design was decimal

2.6.2. Mark I and ENIAC

2.6.3. John von Neumann proposed binary data processing (1945)

2.6.4. Used for both instructions and data

2.6.5. There is a natural relationship between on/off switches and calculation using Boolean logic: binary is a base-2 numbering system in which each digit is either a 0 (off) or a 1 (on)

2.7. Octal System

2.7.1. Also known as the base-8 numbering system

2.7.2. There are only eight different digits available (0, 1, 2, 3, 4, 5, 6, 7)

2.8. Decimal System

2.8.1. Decimal is a base 10 numbering system

2.8.2. Each digit in the number is multiplied by 10 raised to a power corresponding to that digit position.
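For example, 4072 expands as 4*10^3 + 0*10^2 + 7*10^1 + 2*10^0; a short Python check of the positional expansion:

```python
digits = [4, 0, 7, 2]

# Multiply each digit by 10 raised to its position (rightmost position is 0)
value = sum(d * 10 ** i for i, d in enumerate(reversed(digits)))
print(value)

assert value == 4 * 10**3 + 0 * 10**2 + 7 * 10**1 + 2 * 10**0
```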

2.9. Hexadecimal System

2.9.1. Hexadecimal is a base 16 numbering system

2.9.2. Used not only to represent integers

2.9.3. Also used to represent sequences of binary digits
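Because one hexadecimal digit corresponds to exactly four binary digits, hex is a compact notation for bit patterns:

```python
pattern = 0b1101_0011  # the bit pattern from the binary examples

# Group the bits in fours: 1101 -> d, 0011 -> 3
print(hex(pattern))  # 0xd3

# Formatting lets us view any value as a fixed-width bit string
print(f"{pattern:08b}")  # 11010011

assert 0xd3 == 0b11010011
```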

3. ASCII Codes, Unicode

3.1. The Alphanumeric Representation

3.1.1. Data entered as characters, digits, and punctuation is known as alphanumeric data

3.1.2. Three alphanumeric codes are in common use: ASCII (American Standard Code for Information Interchange), Unicode, and EBCDIC (Extended Binary Coded Decimal Interchange Code)

3.2. ASCII

3.2.1. Provides a standard for encoding various symbols (both printable and non-printable)

3.2.2. each binary value between 0 and 127 represents a specific character.

3.2.3. Most computers extend the ASCII character set to use the full range of 256 characters available in a byte

3.2.4. The upper 128 characters handle special things like accented characters from common foreign languages.

3.2.5. ASCII works by assigning standard numeric values to letters, numbers, punctuation marks and other characters such as control codes.

3.2.6. Keyboard input: key ("scan") codes are converted to ASCII; the ASCII codes are sent to the host computer, received as a "stream" of data, stored in a buffer, and then processed
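Python exposes these numeric character codes directly through ord() and chr():

```python
# Standard ASCII assigns 65 to 'A', 97 to 'a', and 48 to '0'
print(ord("A"), ord("a"), ord("0"))

# chr() inverts the mapping, from code back to character
print(chr(65))  # A

# Control codes occupy 0-31; all standard ASCII fits in 7 bits (< 128)
assert all(ord(c) < 128 for c in "Hello, World!")
```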

3.3. Unicode

3.3.1. A worldwide character-encoding standard

3.3.2. Its main objective is to enable a single, unique character set capable of supporting all characters from all scripts

3.3.3. Originally designed as a 16-bit standard; it has since been extended beyond 16 bits

3.3.4. It is a superset of ASCII

3.3.5. Usage of Unicode: encoding text for password creation; encoding characters for display in web pages; encoding characters used in email; representing characters used in documents
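A short Python example shows Unicode as a superset of ASCII: ASCII characters keep their codes, while other scripts get higher code points (and more bytes under the UTF-8 encoding):

```python
text = "café ☕"

# Code points: ASCII letters stay below 128; é is 233, ☕ is 9749
print([ord(c) for c in text])

# UTF-8 uses one byte for ASCII characters and more for higher code points
encoded = text.encode("utf-8")
print(len(text), len(encoded))  # 6 characters, 9 bytes

assert "A".encode("utf-8") == b"A"  # ASCII bytes are unchanged under UTF-8
```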

3.4. ASCII vs Unicode

3.4.1. ASCII: has 128 code points, 0 through 127; encodes each character in 7 bits; covers only characters from the English language

3.4.2. Unicode: has 1,114,112 code positions; can encode characters in 16 bits and more; can encode characters from virtually all of the world's languages
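The code-position counts can be checked arithmetically: Unicode's code space has 17 planes of 2^16 positions each, and ASCII occupies only the first 128 of them:

```python
# 17 planes * 65,536 positions per plane = 1,114,112 code positions
assert 17 * 2 ** 16 == 1_114_112
assert 0x10FFFF + 1 == 1_114_112  # the highest code point is U+10FFFF

# ASCII is the first 128 positions, so every ASCII byte decodes
# identically whether treated as ASCII or as UTF-8
ascii_bytes = bytes(range(128))
assert ascii_bytes.decode("ascii") == ascii_bytes.decode("utf-8")
print("all checks passed")
```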