Page 1: My training report (pdf)

A

SUMMER TRAINING REPORT

ON

BROADCASTING & STUDIO SETUP

IN

“DOORDARSHAN KENDRA”

(JAIPUR)

Submitted for partial fulfilment for the award of the degree of the

Bachelor of Technology

Rajasthan Technical University, Kota

(2014-2015)

SUBMITTED TO:
Mr. Mukesh Arora
Reader, ECE Branch

SUBMITTED BY:
Prakhar Gupta
B.Tech, 4th Year (7th Sem), ECE

Department of Electronics and Communication Engineering

SWAMI KESHVANAND INSTITUTE OF TECHNOLOGY

MANAGEMENT & GRAMOTHAN, JAGATPURA

JAIPUR


ACKNOWLEDGEMENT

Training is one of the important aspects of an engineering student’s career. It serves mainly to strengthen practical concepts. During training, a student gets acquainted with the latest technology and recent developments.

Firstly, I convey my sincere thanks to all the employees of DDK, Jaipur, for their constant guidance throughout the training period. I convey special thanks to Mr. A. K. Tambi for providing me the opportunity to undergo this training, and I also thank all the staff members for their help and cooperation.

I also want to thank Mr. Mukesh Arora (Reader, ECE) for allowing me to do my summer training at DDK Jaipur.

- Prakhar Gupta


TABLE OF CONTENTS

CERTIFICATE

ACKNOWLEDGEMENT

LIST OF CONTENTS

LIST OF FIGURES

CHAPTER 1 INTRODUCTION TO BROADCASTING

1.1 Organizational setup

1.2 Technical details

CHAPTER 2 STUDIO

2.1 Main jobs in studio

2.2 Equipment in studio

2.3 Equipment in PCR

2.4 Equipment in MSR

2.5 Equipment in VTR

2.6 News section

2.7 OB van

2.8 DSNG van

CHAPTER 3 TELEVISION CAMERA

3.1 Live camera

3.2 CCD sensor

CHAPTER 4 VIDEO PRODUCTION SWITCHER & DIGITAL VIDEO EFFECT

4.1 Basic functions of switcher

4.1.1 Methods of transitions

4.1.2 Mixing

4.1.3 Wiping

4.1.4 Keying

4.2 Key terms

4.2.1 Chroma keying

4.3 Configuration of production switcher

4.4 Digital video effect

CHAPTER 5 SATELLITE COMMUNICATION

CHAPTER 6 MICROWAVE LINK, OPTICAL FIBRE LINK & MEASUREMENT

6.1 Links for OB coverage system

6.2 Wave propagation

6.3 Antennas

CONCLUSION

REFERENCE


LIST OF FIGURES

Figure No. Figure Description

1.1 Organizational setup

2.1 Studio setup

2.2 Equipment in PCR

2.3 Block diagram of news section

2.4 Block diagram of OB van

2.5 Block diagram of DSNG van

3.1 Graph of knee function

3.2 Graph of gamma & gamma correction

3.3 Transfer function

4.1 Vision mixer

4.2 Keying

4.3 Key filling

4.4 Chroma keying

4.5 Light effects

5.1 Satellite communication

6.1 Simplified microwave link

6.2 Microwave transmitter link

6.3 Microwave receiver link

6.4 Down converter


1. Introduction

What is broadcasting?

The process of sending information to a distant place is called broadcasting. It is one-way communication, without interaction. We can send information to a distant place with a good quality of signal (intelligence).

Doordarshan aims to provide information, education and entertainment for the public.

History of broadcast

Pre-independence

June, 1923

Broadcast of programmes by the radio club of Bombay.

November, 1923

Calcutta radio club puts out programmes.

16th May, 1924

Madras presidency radio club founded.

31st July, 1924

Broadcasting service initiated by the club.

23rd July, 1927

Indian Broadcasting Company (IBC), Bombay station inaugurated by Lord Irwin, the Viceroy of India.

26th August, 1927

Calcutta station of IBC inaugurated.

1st April, 1930

Indian state broadcasting service under department of industries and labor commences

on experimental basis.


March, 1935

A new department, Controller of Broadcasts, constituted.

August 30, 1935

Lionel Fielden appointed the first Controller of Broadcasting in India.

June 8, 1936

Indian state broadcasting service becomes all India radio and in the same year Delhi

station was formed.

November, 1937

AIR comes under the Department of Communications.

1st October, 1939

External services started with a broadcast in Pushtu.

October 24, 1941

AIR comes under department of I & B.

23rd February, 1946

AIR comes under the department of information and arts.


Facts as on 15th August 1947

When India attained Independence in 1947, AIR had a network of six stations and a

complement of 18 transmitters. The coverage was 2.5% of the area and just 11% of

the population. Rapid expansion of the network took place post-independence.

On 12th November 1947 the voice of Mahatma Gandhi was broadcast on AIR, and that day has since been celebrated as Broadcasting Day. Television (Doordarshan) started in India in the year 1959 with black and white transmission. Transmission was converted fully to colour in 1982, during the Asian Games.

1.1 ORGANIZATIONAL SETUP

Figure 1.1: Organizational Setup

An overview of DDK Jaipur

• Started in 1977 as UDK

• Studios came in 1987

• 24x7 service launched on 16th August 2013

(Figure 1.1: the Ministry of I & B oversees Prasar Bharati, under which Akashvani and Doordarshan each have Programme, News and Engineering wings.)


1.2 Technical details

STUDIO:

• Recording studio: 400 sq m

• Transmission studio: 50 sq m

• 7-camera OB van

• Ku-band DSNG van / digital e/s

2. STUDIO SETUP

Figure 2.1: Studio setup

2.1 Main jobs in studio

• Program Production.

• Post Production.

(Figure 2.1 block diagram: Studio I and Studio II floors feed PCR I and PCR II; signals pass through the MSR to the VTR area, earth station and transmitter, with external signals, OFC and STL links, and news feeds from ENPS.)


• Transmission

2.2 Equipment in studio

• Program Sets

• Lights

• Cameras

• Microphones

• Communication

2.3 Equipment in PCR

• Production panel

• CCU

• Vision mixer

• Audio console

• Lighting panel

• Communication

Figure 2.2: Equipment in PCR


2.4 Equipment in MSR

• Camera hardware

• Vision mixer hardware

• Sync pulse generator

• Video/audio distributors

• STL transmission unit

• OFC transmission unit

• IRDs

2.5 Equipment in VTR

• Video recording machines

• Servers

2.6 News section

Figure 2.3: Block diagram of news section


2.7 OB Van

It is a multi-camera setup used for big recordings and live coverage.

Figure 2.4: OB van

2.8 DSNG Van

A digital satellite news gathering van is employed for news coverage. It has an uplink system to send the signal up to the satellite.

Figure 2.5: DSNG van

3. Television camera

3.1 Live cameras

Studio Camera

OB camera

ENG camera

A live camera generates video signals from optical images of indoor and outdoor scenes, which may be under the controlled conditions of a studio (studio camera), at a sports venue (OB camera), or at any location in the field, such as a news event or in the home (ENG camera).

Studio cameras are designed to provide optimum performance in a controlled indoor

environment, usually at the expense of size, weight, and portability. These cameras

have all the features necessary to operate in a system with other cameras and

recorders. The basic functions of portable and studio cameras are similar, but the

portable camera must have the ability to control its own operating parameters such as

gain and sensitivity automatically, because the camera user cannot be expected to make any technical adjustments.

Studio Cameras

Studio cameras are larger, with some of the equipment located remotely from the camera head.

A larger camera is mechanically more stable and, with the proper mounting equipment, can be moved more smoothly than a small lightweight camera.

A larger camera can support larger lenses, which provide faster optical speed and greater zoom ranges. This is especially important for sporting event pickup.

Studio cameras are always connected into a system, which requires a large number of connections to support synchronization, video and audio going both from and to the camera, control, intercom, and power. Nowadays studio cameras generally use a triax cable, with all signals multiplexed on it.

In a studio setting, it is desirable for the camera operator to devote all his or her attention to the picture; technical matters or adjustments should be handled elsewhere. Thus studio cameras usually have a remote control position where a technically skilled operator can oversee the camera’s technical operations and do whatever is required to optimize the picture performance and quality.

A TV camera consists of three sections:

1. Camera lens and optics: to form an optical image on the faceplate of a pickup device.

2. A transducer or pickup device: to convert the optical image into an electrical signal.

3. Electronics: to process the output of the transducer to get a CCVS signal.


3.2 CCD sensor

A CCD sensor is a pixel sized solid state photosensitive element that generates and

stores an electric charge when it is illuminated. It is a building block for a CCD

imager, which is a rectangular array of sensors upon which an image of the scene is

focused. In most configurations, the sensor includes the circuitry that stores and

transfers its charge to a shift register, which converts the spatial array of charges in

the CCD imager into a time-varying video output current.

Functions performed by CCDs:

1. Photo sensing

2. Sampling

3. Charge storage

4. Charge transfer

CCD imagers

CCD imagers consist of a matrix of individual sensors, one sensor per image pixel

(three for tri color CCD), mounted in a rectangular array on a silicon substrate. They

develop electric charges that are proportional to their illumination by an image of the

scene focused on the array. The charges are stored temporarily in the potential “wells”

of the sensors and are transferred to shift registers during the field blanking interval,

which then transmit them to the imager output in the proper time sequence to generate

the video output signal. The purpose of CCD transfer process is to read out the pixels’

stored charges sequentially to a video output.
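The read-out sequence described above can be sketched as follows (a toy model for illustration; a real imager transfers whole rows in parallel during the blanking interval):

```python
# Sketch of the CCD read-out idea: a 2-D array of stored charges is
# transferred row by row into a shift register, then clocked out serially
# to form the time-varying video output signal.

def ccd_readout(charges):
    """Serialize a matrix of pixel charges in raster order."""
    video = []
    for row in charges:               # one row transferred per line period
        shift_register = list(row)    # parallel transfer into the register
        while shift_register:         # serial clock-out, pixel by pixel
            video.append(shift_register.pop(0))
    return video

frame = [[1, 2], [3, 4]]
print(ccd_readout(frame))  # [1, 2, 3, 4]
```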

Three different types of CCD chips are used as pickup devices for cameras:

1. Interline transfer type (IT)

2. Frame transfer type (FT)

3. Frame interline transfer type (FIT)

Knee function (highlight control)


Highlight is the name given to an area of the scene where the light level goes above the level chosen for reference white. This can come from light bulbs, car headlights at night, reflections, or simply a bright sky creeping into a scene that is otherwise correctly exposed.

Figure 3.1: Knee function

Gamma & gamma correction

In a TV system the transfer function between light values and signal voltages at the picture tube is not linear. A receiver picture tube does not emit light in direct proportion to the signal voltage applied. This nonlinear relationship between light output (L) and signal voltage (V) may be expressed as

L ∝ V^γ

The exponent of the transfer function is called gamma (γ); it specifies the slope of the transfer curve at any point. To compensate for this non-linearity at the picture tube, an opposite distortion, referred to as gamma correction, is introduced at the picture source. If E is the camera voltage resulting from a given light input (Li), then a gamma-corrected voltage of E^(1/γ) will yield a light output at the receiver which is directly proportional to the light input at the camera:

L_O ∝ V^γ = [E^(1/γ)]^γ = E ∝ Li

i.e. L_O ∝ Li

Or: light output ∝ light input at the pickup device. (It is presumed that the camera voltage E is linearly related to the light input at the camera.)

Gamma correction is usually applied in each of the three color channels in our TV system. If gamma is less than unity (as used in pickup devices), whites are compressed (crushed) and blacks are expanded (stretched); if the system is linear, the gamma of the system is unity. In black & white TV, the effect of a high overall gamma is to stretch the whites and compress the blacks, but in color TV it causes serious errors in the hue.

A Gamma of more than unity (typically 1.2) is preferred to compensate for the loss of

contrast due to optical flare etc.

Figure 3.2: Gamma & gamma correction
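The cancellation derived above can be sketched numerically (γ = 2.2 is assumed here purely for illustration): the camera's pre-correction and the picture tube's non-linearity compose to a linear overall system.

```python
# Sketch of gamma correction. The camera applies V = L_i ** (1/gamma);
# the picture tube responds with L_o = V ** gamma, so overall L_o == L_i.

gamma = 2.2  # illustrative value

def camera_gamma_correct(light_in: float) -> float:
    """Gamma-corrected camera voltage for a normalized light input (0..1)."""
    return light_in ** (1 / gamma)

def picture_tube(voltage: float) -> float:
    """Non-linear light output of the display for a given signal voltage."""
    return voltage ** gamma

for L_i in (0.0, 0.25, 0.5, 1.0):
    L_o = picture_tube(camera_gamma_correct(L_i))
    print(f"L_in={L_i:.2f} -> L_out={L_o:.2f}")
```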


Transfer function of pick up device

Figure 3.3: Transfer function

4. Video production switcher and digital video effect

4.1 Vision mixer or production switcher

Production switchers are essential to all live television operations. A production switcher is used either to switch (cut) between two video sources or to combine them in a variety of ways. The principal methods used to combine video are mixing, wiping, and keying. "Vision mixer" and "video mixer" are almost exclusively European terms describing both the equipment and its operator.

In the United States, the common name for a device of this kind is the (video) production switcher, and the operator of the device is known as the technical director.

Figure 4.1: Vision mixer

4.2 Basic functions of a switcher

Cut operation or switching between two sources.

Mixing or dissolve.

Wiping.

Keying.

4.2.1 Methods of transitions

Cut

Mix or dissolve

Wipe

Fade

DVE

Cut operation

It allows an instantaneous changeover from one video source to another. Switching from one source to another is done during the vertical interval period to make the changeover invisible on the screen.

Types of mixing

Additive mixing

Non-additive mixing

4.2.2 Mixing

Mixing is an additive combination of two video sources. Two input sources are mixed

in proportion in a summing amplifier as decided by the position of the control fader.

The two extreme positions of the fader give either of the sources at the output; the middle position gives a mixed output of the two sources. The sum of the two video sources is always 1 volt

p-p. Control to the summing amplifier is derived from the fader.
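The fader arithmetic above can be sketched as follows (levels normalized so that full amplitude is 1 V p-p):

```python
# Minimal sketch of additive mixing: the fader position t (0..1) weights
# the two video sources so their sum never exceeds 1 V peak-to-peak.

def mix(source_a: float, source_b: float, fader: float) -> float:
    """Additive mix of two video levels; fader=0 gives A, fader=1 gives B."""
    return (1.0 - fader) * source_a + fader * source_b

# Two full-level (1 V p-p) sources: the mixed output stays at 1 V p-p.
print(mix(1.0, 1.0, 0.5))   # 1.0
print(mix(1.0, 0.0, 0.25))  # 0.75, mostly source A
```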


4.2.3 Wiping

This is a non-additive combination of two sources. A switchover occurs during the active video at specific points on the raster to produce a pattern between the two video sources. Switching is controlled by an internal wipe pattern generator, which can provide either sawtooth or parabolic waveforms at H, V, or both H and V rates. Waveforms from very simple to very complex can be generated by the wipe pattern generator. Simple patterns can be generated by logical combination of different waveforms; however, the production of more complex patterns is better handled by a microprocessor.
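The comparator idea behind a simple horizontal wipe can be sketched as a toy one-line model (a real wipe generator compares an analog ramp against the fader level during every line):

```python
# Sketch of a horizontal wipe: a sawtooth ramp across the line is compared
# with the wipe position; left of the edge shows source A, right shows B.

def wipe_line(a_line, b_line, position):
    """Switch between two lists of pixel values at a fractional position."""
    n = len(a_line)
    edge = int(position * n)          # comparator threshold on the ramp
    return a_line[:edge] + b_line[edge:]

a = [1] * 8   # source A: a white line
b = [0] * 8   # source B: a black line
print(wipe_line(a, b, 0.5))  # [1, 1, 1, 1, 0, 0, 0, 0]
```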

4.2.4 Keying

This is also a non-additive combination of two video sources. In keying

operation, part of the picture (Background) is replaced by another picture

(foreground) according to the limits set by a third signal called key or alpha

signal, which can be derived either from the picture being keyed itself (self key or internal key), or from a third video signal from a camera, CG, or DVE etc. (auto select or external key).

The keying signal provides a switching square waveform to switch between the foreground and background sources during active video. This keying signal can be generated from the luminance, hue, or chrominance of the source input. The keyed portion can be filled with the same source (self key), with a color (matte key), or with an external source (split key).
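A minimal sketch of a luminance key along these lines (the threshold and pixel levels are illustrative):

```python
# Sketch of a luminance key: pixels of the key source brighter than a
# threshold cut a hole in the background, which is filled with foreground.

def luminance_key(background, foreground, key_source, threshold=0.5):
    """Composite per pixel: foreground where the key is bright, else background."""
    out = []
    for bg, fg, key in zip(background, foreground, key_source):
        out.append(fg if key > threshold else bg)
    return out

bg  = [0.2, 0.2, 0.2, 0.2]   # background picture
fg  = [0.9, 0.9, 0.9, 0.9]   # fill (could also be a matte color)
key = [0.1, 0.8, 0.8, 0.1]   # e.g. white lettering on a dark card
print(luminance_key(bg, fg, key))  # [0.2, 0.9, 0.9, 0.2]
```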

4.3 Key terms

4.3.1 Luminance key

4.3.2 Matte key

4.3.3 Self key

4.3.4 Split key


Figure 4.2: Keying

There are three elements to every key:

1. Background (from the background bus).

2. Key cut (the keyhole cut, made by any of four types: luminance, linear, chroma, preset pattern).

3. Key fill (from the key fill bus or matte fill).


Figure 4.3: Key filling

Rules for making key cuts:

Luminance key

Cuts a hole where black meets white; the key is either fully on or fully off (black or white).

Linear key

Chroma key

Cuts a hole from hue of a color.

Preset pattern key


Cuts a hole by using the black and white image from the wipe generator.

4.3.1 Chroma keying

Figure 4.4: Chroma keying

4.4 Configuration of production switcher

Analog or digital

Component or composite

Number of inputs

Number of M/E systems

M/E functions

Mix

Mix transition

Wipe


Wipe transition

Key

Main features

Mixing (Or Dissolve)

Manual or auto transition

Additive or non-additive mixing

Wiping

Number of patterns

Rotational patterns

Bordering mode, hard or soft, and color edges

Direction mode

Modulation mode

Positioner mode

Aspect change

Multiple patterns

Keying

Luminance, Linear, Chroma, and preset pattern key

Source selection

Invert mode

Insert selection (video, color or still stores)

Bordering selection (in title keys)

Key shadow features


Effects memory system

User programming

Learn mode programming

Additional switcher features

Separate mix system (PGM)

Title or downstream keying

Quad splitting

Tally systems

Auxiliary switching

Digital video effects

Custom control

4.5 Digital video effects

Digital video effects are created by digitizing the video signal so that it can be stored, retrieved, and manipulated in digital form. These effects are generated by a digital video effects device, which can be a separate standalone unit or an integral part of the production switcher. A typical digital video effects system consists of the control panel, controller electronics to generate the required control signals that define the effect, and the digital video processor, which uses the control signals to manipulate the video signal in the digital domain to produce the desired effect.

3D creativity

Imaging effects

Curvilinear effects

Painterly and particle effects

Slabs

Border effects


Lighting effects

Shadows

Dynamic recursive effects

Imaging effects

Ability to blur and skew an image in both horizontal and vertical directions.

Unique matte colors for any and all layers in the system, with adjustable hue, luminance, saturation, and opacity.

Curvilinear effects

Warps– Creation of bursts, radial bursts, slates, ramps, arrows, diamonds, rings,

balls, splashes, hour-glasses, and ripples.

Page turns– Allows creating page turns with the ability to manipulate the turn’s size, position, axis, and angle of rotation. Different images can be placed on either side of the page.

Page scrolls– Allows to create page scrolls with the ability to manipulate the

scroll’s size, position, axis, and angle of rotation. Different images can be placed

on either side of the page, with added ability to split the page in a number of

different ways.

Painterly and particle effects

Painterly effects allow modifying the texture surfaces with tiles, bubbles,

corrugation, crystallization effects and both stained and beveled glass effects.

Particle effects allow controlling the amplitude, axis and angle of particles to

create explosions, swirls, blowing sand, bursts, slats, and fuzz effects.

Slabs

Creating rectangular solids such as cubes and slabs. Any channel or input can be assigned to any side of the cube.

Lighting effects

Automatic highlights, Manual highlights and reverse manual highlights with

control over the light’s X and Y position, aspect, radius, hue, luminance, and

saturation.


Border effects

Border can be placed on any combination of channels, with completely adjustable

hue, luminance, saturation, width and softness. A wide variety of border “types”

are available, including metallic tubes, frames, and bevels.

Figure 4.5: Light effects


5. Satellite communication

Figure 5.1: Satellite communication

Satellite history

Started in 1957 in Russia (Sputnik)

Started in India on 14.04.1982 (INSAT 1A)

Advantages

1. Large coverage area (42% of the Earth) and very high bandwidth (wideband, multi-channel).

2. Reaches pockets uncovered by terrestrial transmitters, such as valleys and mountain regions.

3. Uniform signal.

4. Easily established for point-to-point communication.


5. In critical conditions, earth stations can be removed, relocated, and brought back into service easily and quickly.

6. Satellite costs and signal quality are independent of distance, which is not the case with optical fibre cables.

Disadvantages of Satellite communication

1. The communication path between Tx and Rx is approx. 75,000 kilometres.

2. The delay of 270 + 270 ms makes two-way communication annoying.

3. The delay reduces the efficiency of satellite links for data transmission during long file transfers.

4. High atmospheric losses above 30 GHz limit the usable carrier frequencies.
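The delay figures above follow directly from the path length (a slant range of about 40,000 km per leg is assumed here; the exact value depends on the earth station's position relative to the satellite):

```python
# Rough delay figures for a geostationary link.

C_KM_S = 3.0e5             # speed of light, km/s
SLANT_KM = 40_000          # ground station to satellite, one leg (assumed)

one_way = 2 * SLANT_KM / C_KM_S      # uplink + downlink, in seconds
round_trip = 2 * one_way             # there and back

print(f"one-way delay ~ {one_way * 1000:.0f} ms")     # ~267 ms
print(f"round trip   ~ {round_trip * 1000:.0f} ms")
```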

Satellite orbits

LEOs - Low Earth Orbit

MEOs - Medium Earth Orbit

GEOs - Geostationary Earth Orbit

6. Microwave link, optical fibre link & measurements

A microwave link is a communications system that uses a beam of radio waves in the

microwave frequency range to transmit video, audio, or data between two locations,

which can be from just a few feet or meters to several miles or kilometres apart.

Microwave links are commonly used by television broadcasters to transmit

programmes across a country.

Purpose of live links for coverage

(a) Live News Event

(b) Live Sport

6.1. Links for OB coverage System

(a) Micro Wave Link


(b) COFDM based digital wireless camera system.

(c) Optical fibre system.

(d) Satellite link (DSNG)

6.2. Wave propagation

Propagation of Radio waves takes place by different modes, the mechanism being

different in each case.

Based on that, it can be classified as:

1. Ground (Surface) waves

2. Sky (Ionosphere) waves

3. Space (Tropospheric) waves

Ground (Surface) Waves

Medium wave (MW) propagates along the surface of the earth. It is normally

vertically polarized to avoid short circuiting of the electric component by the ground.

Sky (Ionosphere) Waves

Short wave (SW) propagates as sky waves. Ionisation of the upper parts of the earth’s atmosphere plays a part in the propagation of high frequency waves.

Space (Tropospheric) Waves

Space waves travel more or less in straight lines. As they depend on line-of-sight conditions, their propagation is limited by the curvature of the earth.

Microwave is space wave communication based on line of sight. Links of 30 to 60 km in a single hop over flat terrain are possible at frequencies of 2 to 8 GHz.

With antennas fixed on mountain tops, links up to a distance of 200 km are in existence. These links use frequencies between 2 and 24 GHz.
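A rough line-of-sight estimate consistent with these figures can be sketched using the standard 4/3-earth radio-horizon approximation (the antenna heights below are illustrative, not from the report):

```python
# Rough radio line-of-sight range with the 4/3-earth approximation:
# d_km ~ 4.12 * (sqrt(h1_m) + sqrt(h2_m)), antenna heights in metres.

from math import sqrt

def radio_horizon_km(h_tx_m: float, h_rx_m: float) -> float:
    """Approximate radio horizon distance between two elevated antennas."""
    return 4.12 * (sqrt(h_tx_m) + sqrt(h_rx_m))

# Two 50 m towers give roughly a 58 km single hop, consistent with the
# 30-60 km flat-terrain figure quoted above.
print(f"{radio_horizon_km(50, 50):.0f} km")  # 58 km
```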


6.3. Antennas

Microwave links use two kinds of antennas:

1. Parabolic type

2. Horn type

These antennas are highly directional. The parabolic antenna is the most commonly used; its feed is a waveguide with its opening placed at the focus of the parabolic reflector. The waveguide opening is designed to match the impedance of the waveguide to that of free space. The focus of a parabola is the point at which all the light energy falling on it converges; since light and microwaves are both electromagnetic waves with similar properties, all the microwave energy likewise converges at the focus of the parabolic structure. The parabolic antennas in use come in a variety of sizes.

Figure 6.1: Simplified microwave link


Microwave transmitter link

Figure 6.2: Microwave transmitter

A standard 1 volt video signal is applied to the transmit video amplifier PCB input. This unit processes the signal to provide all the parameters required by the system before passing it to the video modulator PCB.

Video Modulator

The transmit Video Modulator PCB takes a standard 1 volt composite video signal

(pre-emphasised with sound sub-carriers) and frequency modulates a 70 MHz voltage

controlled oscillator (VCO). The output of the modulator is an FM-modulated carrier at 70 MHz.


Up-converter

The up-converter processes the 70 MHz IF output of the video modulator stage to produce a 1 GHz RF signal. The 1 GHz signal is amplified and passed through a 1 GHz filter before it is applied to the power control stage.

Microwave Receiver Block Diagram

Figure 6.3: Microwave receiver

Down converter

The down converter processes the 1 GHz product of the first mixer stage to produce a 70 MHz IF signal. The 1 GHz signal is filtered and amplified before being mixed with a 930 MHz signal to produce the 70 MHz IF, which is then passed to the IF AGC stage where it is further amplified and filtered.
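The frequency plan described above can be checked with a simple ideal-mixer sketch: mixing the 1 GHz signal with a 930 MHz local oscillator yields the difference product at the 70 MHz IF (the sum product is filtered away).

```python
# Frequency plan sketch for the receiver chain described above.

def mixer_products(rf_mhz: float, lo_mhz: float):
    """Sum and difference products of an ideal mixer, in MHz."""
    return rf_mhz + lo_mhz, abs(rf_mhz - lo_mhz)

total, intermediate = mixer_products(1000.0, 930.0)
print(f"sum = {total} MHz, difference (IF) = {intermediate} MHz")  # IF = 70 MHz
```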


Figure 6.4: Down converter

Video Demodulator

The video demodulator stage contains all the functions necessary to take an FM-modulated IF signal at 70 MHz, demodulate it to composite baseband (i.e. video plus sound sub-carriers), amplify, de-emphasize, and filter the video, and separate off the sub-carriers for subsequent demodulation.


CONCLUSION

At DDK Jaipur we learnt about the history of broadcasting and the basic idea behind it: what broadcasting is and why it is needed. We also saw the studio setup of DDK Jaipur and learnt how a programme is recorded; how, after recording, the video is transmitted to the satellite and on to receivers; and the intermediate processes between transmitter and receiver, such as how the audio and video are mixed and how keys are used to write titles over video.


REFERENCE

Images from Google Images (https://googleimages.com)

Broadcasting content from Katson Books (Analog Communication)

Keying methods from Slideshare.com

Broadcasting history from www.india360.com

Report format from www.rtu.ac.in

Wave propagation techniques from Antenna and Wave Propagation by Dr. K. D. Prasad

Modulation techniques from Katson Books (Analog Communication)