Transcript
Page 1: Presentation1
Page 2: Presentation1

Scholar: Tahira Altaf
Subject: Applied Research in Education
Submitted to: Dr Muhammad Ramzan
Department of Educational Training
THE ISLAMIA UNIVERSITY OF BAHAWALPUR

Page 3: Presentation1

THIS ASSIGNMENT WILL COVER THE FOLLOWING TOPICS

Measurement as a tool of research
Four scales of measurement
Validity and reliability in measurement
Statistics as a tool of research
Functions of statistics in educational research
Human mind as a tool of research
Inductive and deductive logic
Scientific method
Critical thinking

Page 4: Presentation1

DEFINITIONS OF MEASUREMENT

According to the Advanced Learner's Dictionary (p. 954),

“The act or process of finding the size, quality or degree of something is called measurement.”

According to Paul Leedy,

“Measurement is limiting the data of any phenomenon, substantial or insubstantial, so that those data may be interpreted and ultimately compared to an acceptable qualitative or quantitative standard.”

Page 5: Presentation1

DEFINITION OF SCALES OF MEASUREMENT

According to Gay L.R., Mills Geoffrey E. and Airasian Peter, in the book Educational Research (p. 145),

“A measurement scale is a system for organizing data so that it may be inspected, analyzed and interpreted. In other words, the scale is the instrument used to provide the range of values or scores for each variable.”

Page 6: Presentation1

SCALES OF MEASUREMENT

Nominal Scale: The word nominal comes from the Latin root nomen, which means name. A nominal scale measures nominal variables.

According to Gay L.R. (p. 145),

“A nominal variable is also called a categorical variable because its values include two or more named categories.”

These variables include sex, employment factors, etc.

Ordinal Scale: According to Gay L.R. (p. 421),

“An ordinal scale not only classifies subjects but also ranks them in terms of the degree to which they possess a characteristic of interest.”

Ordinal scales permit us to describe performance as higher, lower, worse, better, etc.

Page 7: Presentation1

Interval Scales: According to Gay L.R. (p. 421),

“An interval scale has all the characteristics of nominal and ordinal scales, but in addition it is based upon predetermined equal intervals.”

Achievement tests, aptitude tests and intelligence tests are examples of interval scales.

Ratio Scales: According to Gay L.R. (p. 422),

“A ratio scale represents the highest, most precise level of measurement. A ratio scale has all the advantages of the other types of scales and, in addition, it has a meaningful true zero point.”

Height, weight, time, distance and speed are examples of ratio scales.

Page 8: Presentation1

Comparison of Measurement Scales:

Scale: Nominal. Description: Categorical. Example: Northern, Southern; Dictators, Democrats; eye color; male, female; public, private; gifted students, typical students.

Scale: Ordinal. Description: Rank order and unequal units. Example: Scores of 5, 6, and 10 are unequal to scores of 1, 2, and 3.

Scale: Interval. Description: Rank order and equal interval units, but no true zero point. Example: A score of 10 and a score of 30 have the same degree of difference as a score of 60 and a score of 90.

Scale: Ratio. Description: All of the above plus a true zero point. Example: A woman is 5 feet tall and her friend is two-thirds as tall as she is.
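To make the four levels of measurement concrete, here is a minimal Python sketch (assuming pandas is available) of one way such data might be represented: nominal and ordinal variables as unordered and ordered categoricals, interval and ratio variables as plain numbers. The column names and values are invented for illustration and are not from the slides.

```python
import pandas as pd

# Nominal: named categories with no inherent order (e.g., school type)
school_type = pd.Categorical(
    ["public", "private", "public"], categories=["public", "private"]
)

# Ordinal: ranked categories, but the gaps between ranks are not equal
performance = pd.Categorical(
    ["low", "high", "medium"],
    categories=["low", "medium", "high"],
    ordered=True,
)

# Interval: equal units but no true zero (e.g., a scaled achievement score)
achievement_score = pd.Series([10, 30, 60])

# Ratio: equal units and a meaningful true zero (e.g., height in feet)
height_ft = pd.Series([5.0, 3.3, 4.1])

df = pd.DataFrame({
    "school_type": school_type,
    "performance": performance,
    "achievement_score": achievement_score,
    "height_ft": height_ft,
})
print(df.dtypes)               # categorical vs. numeric columns
print(df["performance"].min()) # ordering is defined only for the ordered (ordinal) column
```

Keeping the scale type explicit in the data this way helps later analysis, since operations such as ranking or taking means only make sense at certain levels of measurement.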

Page 9: Presentation1

VALIDITY

“Validity refers to the degree to which a test measures what it is supposed to measure and consequently permits appropriate interpretation of scores”.

Page 10: Presentation1

TYPES OF VALIDITY

Content validity
Criterion-related validity
Concurrent validity
Predictive validity
Construct validity
Consequential validity

Page 11: Presentation1

CONTENT VALIDITY

Definition: “Content validity is the degree to which a test measures an intended content area.” In this type of validity, the researcher first selects the target content and then constructs the test to check its validity. Content validity has two further subtypes.

Page 12: Presentation1

CRITERION-RELATED VALIDITY

According to Airasian Peter, “Criterion-related validity is the degree to which scores on one test are related to scores on a similar, preexisting test administered in the same time frame, or to another available valid measure at the same time.”

Page 13: Presentation1

CONCURRENT VALIDITY

“Concurrent validity is the degree to which scores on one test are related to scores on a similar, preexisting test administered at the same time.”

Page 14: Presentation1

PROCESS OF CONCURRENT VALIDITY

Administer the new test to a defined group of individuals.
Administer a previously established valid criterion test to the same group at the same time or shortly thereafter.
Correlate the two sets of scores (a sketch of this step follows below).
Evaluate the results.
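As a minimal sketch of the correlation step, the snippet below uses scipy.stats.pearsonr on two made-up score lists; the names new_test and criterion_test and the numbers themselves are illustrative assumptions, not data from the slides.

```python
from scipy.stats import pearsonr

# Hypothetical scores for the same ten students on the new test and on an
# established criterion test taken at (about) the same time.
new_test       = [55, 62, 47, 71, 66, 59, 80, 43, 68, 74]
criterion_test = [58, 65, 50, 69, 70, 57, 84, 41, 66, 78]

r, p_value = pearsonr(new_test, criterion_test)
print(f"concurrent validity coefficient r = {r:.2f} (p = {p_value:.3f})")
# A high positive r suggests the new test ranks students much as the
# established test does; deciding how high is "high enough" is the
# judgment made in the evaluation step.
```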

Page 15: Presentation1

PREDICTIVE VALIDITY

According to Geoffrey E. Mills, “Predictive validity is the degree to which a test can predict how well an individual will do in a future situation.”

Page 16: Presentation1

PROCESS OF DETERMINING PREDICTIVE VALIDITY

Gay L.R., Mills Geoffrey E. and Airasian Peter (p. 156) describe the following procedure for determining predictive validity.

Identify and carefully define the criterion.
Administer the predictor variable to a group.
Wait until the behavior to be predicted, the criterion variable, occurs.
Obtain measures of the criterion for the same group.
Correlate the two sets of scores (see the sketch below).
Evaluate the results.
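The sketch below illustrates the same correlation idea for the predictive case, assuming a hypothetical admission test given now and a hypothetical first-year grade average collected later; the names and numbers are invented for illustration.

```python
import numpy as np

# Hypothetical predictor scores collected now (an admission test) and
# criterion scores collected later (first-year grade average) for the
# same eight students, listed in the same order.
admission_test = np.array([61, 75, 52, 88, 69, 47, 80, 66])
first_year_gpa = np.array([2.6, 3.1, 2.3, 3.7, 2.9, 2.1, 3.4, 2.8])

# Predictive validity coefficient: correlation between the predictor and
# the criterion measure that occurred afterwards.
r = np.corrcoef(admission_test, first_year_gpa)[0, 1]
print(f"predictive validity coefficient r = {r:.2f}")
```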

Page 17: Presentation1

CONSTRUCT VALIDITY

According to Gay L.R. (p. 140), “Construct validity is the degree to which a test measures an intended hypothetical construct. A construct is a non-observable trait, such as intelligence.”

Page 18: Presentation1

CONSEQUENTIAL VALIDITY

Consequential validity is concerned with the consequences that follow from testing. Because testing involves more and more individuals, the consequences of tests have become increasingly important.

Page 19: Presentation1

FACTORS THAT THREATEN VALIDITY

Unclear test directions.
Confusing and ambiguous test items.
Vocabulary too difficult for test takers.
Overly difficult and complex sentence structure.
Inconsistent and subjective scoring methods.
Untaught items included on an achievement test.
Failure to follow standardized test administration procedures.
Cheating, either by participants or by someone teaching the correct answers to the specific test items.

Page 20: Presentation1

TYPES OF VALIDITY (COMPARISON)

Content validity. Method: Compare the content of the test to the domain being measured. Purpose: To what extent does this test represent the general domain of interest?

Criterion-related validity. Method: Correlate scores from one instrument with scores on a criterion measure, either at the same time (concurrent) or at a different time (predictive). Purpose: To what extent does this test correlate highly with another test?

Construct validity. Method: Amass convergent, divergent, and content-related evidence to determine that the presumed construct is what is being measured. Purpose: To what extent does this test reflect the construct it is intended to measure?

Consequential validity. Method: Observe and determine whether the test has adverse consequences for test takers or users. Purpose: To what extent does the test create harmful consequences for test takers?

Page 21: Presentation1

RELIABILITY

According to Gay L.R., “Reliability is the degree to which a test consistently measures whatever it measures.”

Page 22: Presentation1

TYPES OF RELIABILITY

Test-retest reliability
Equivalent forms reliability
Split-half reliability
Scorer/rater reliability

Page 23: Presentation1

TEST-RETEST RELIABILITY

Test-retest is a form of reliability in which the same test is administered to the same participants at two different times.

Page 24: Presentation1

PROCEDURE OF TEST-RETEST RELIABILITY

Administer the test to an appropriate group.
After some time has passed, administer the same test to the same group.
Correlate the two sets of scores (see the sketch below).
Evaluate the results.
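As a minimal sketch of the last two steps, the snippet below correlates two hypothetical administrations of the same test; the same computation applies to equivalent forms reliability on the next slide, with the second list replaced by scores on the alternate form. All scores are invented for illustration.

```python
import numpy as np

# Hypothetical scores for the same six students on the same test,
# administered on two occasions a few weeks apart.
first_administration  = np.array([40, 55, 63, 48, 71, 59])
second_administration = np.array([42, 53, 66, 45, 74, 61])

# The test-retest reliability coefficient is the correlation between the
# two administrations; values near 1.0 indicate stable, consistent scores.
reliability = np.corrcoef(first_administration, second_administration)[0, 1]
print(f"test-retest reliability = {reliability:.2f}")
```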

Page 25: Presentation1

EQUIVALENT FORMS RELIABILITY

In this type of reliability, two tests are administered that do not contain the same test items but are identical in every other way.

Page 26: Presentation1

PROCEDURE OF EQUIVALENT FORMS RELIABILITY

Administer one form of the test to an appropriate group.
At the same session, or shortly thereafter, administer the second form of the test to the same group.
Correlate the two sets of scores.
Evaluate the results.

Page 27: Presentation1

RATIONAL EQUIVALENCE RELIABILITY

Rational equivalence reliability is not established through correlation; rather, it estimates internal consistency by determining how the items in the test relate to one another.
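Rational equivalence reliability is usually estimated with the Kuder-Richardson formulas; the sketch below implements KR-20 on a small made-up matrix of right/wrong answers. The data, the function name, and the choice of KR-20 (rather than another internal-consistency estimate) are illustrative assumptions.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """KR-20 internal-consistency estimate for a students-by-items matrix
    of dichotomous responses (1 = correct, 0 = incorrect)."""
    n_items = responses.shape[1]
    p = responses.mean(axis=0)            # proportion answering each item correctly
    q = 1.0 - p                           # proportion answering each item incorrectly
    total_scores = responses.sum(axis=1)  # each student's total score
    total_var = total_scores.var(ddof=1)  # sample variance of total scores
    return (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / total_var)

# Hypothetical responses: 6 students x 5 items.
answers = np.array([
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1],
    [1, 1, 1, 1, 0],
])
print(f"KR-20 = {kr20(answers):.2f}")  # closer to 1.0 means more internally consistent
```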

Page 28: Presentation1

STATISTICS AS A TOOL OF RESEARCH

Statistical analysis involves the process of collecting and analyzing data and then summarizing the data in numerical form.

Page 29: Presentation1

STATISTICS ANSWERS THE FOLLOWING QUESTIONS

Where does the center of a body of data lie?
How broadly is the data spread?
How strongly are two or more variables interrelated with each other? (A short sketch follows below.)
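As a minimal illustration of these three questions in code, the sketch below computes a center (mean), a spread (standard deviation), and a relationship (correlation) with numpy; the two made-up score lists are assumptions for the example.

```python
import numpy as np

# Hypothetical scores for ten students on a reading test and a writing test.
reading = np.array([62, 70, 55, 81, 66, 74, 59, 90, 68, 77])
writing = np.array([60, 73, 52, 78, 70, 71, 61, 88, 65, 80])

print("center (mean):     ", reading.mean())
print("spread (std. dev.):", reading.std(ddof=1))
print("relationship (r):  ", np.corrcoef(reading, writing)[0, 1])
```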

Page 30: Presentation1

FUNCTIONS OF STATISTICS IN EDUCATIONAL RESEARCH

Descriptive statistics
How much variability exists in different pieces of data?
How are two or more characteristics interrelated with each other?

Inferential statistics
It helps the researcher make decisions about the data.
It answers questions of a quantitative nature. (A brief sketch follows below.)
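As one small example of the inferential side, the sketch below uses an independent-samples t-test from scipy to ask whether two hypothetical groups of exam scores differ by more than chance; the groups, the scores, and the choice of this particular test are illustrative assumptions, not part of the slides.

```python
from scipy.stats import ttest_ind

# Hypothetical exam scores for students taught with two different methods.
method_a = [72, 65, 80, 59, 77, 68, 74, 70]
method_b = [61, 58, 66, 55, 70, 63, 60, 57]

# Inferential statistics: decide whether the observed difference in means
# likely reflects a real difference rather than sampling chance.
t_stat, p_value = ttest_ind(method_a, method_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value (e.g., below 0.05) would lead the researcher to conclude
# that the two groups' mean scores differ beyond what chance alone explains.
```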

Page 31: Presentation1

HUMAN MIND AS A TOOL OF RESEARCH

All other tools and logic become useless without the effective involvement of the human mind.

The researcher's mind is a key element in research work.

Page 32: Presentation1

TO MAKE HIS RESEARCH FRUITFUL, THE RESEARCHER FOLLOWS THESE STEPS

Deductive reasoning
Inductive reasoning
Scientific method
Critical thinking

Page 33: Presentation1

DEDUCTIVE LOGIC

Deductive reasoning involve essentially the reverse process-arriving at specific conclusion based on general principles i.e observation or experience.

Separate and individual facts leads towards single conclusion

Page 34: Presentation1

INDUCTIVE REASONING

Inductive reasoning does not begin with a pre-established truth or assumption. It moves from specific examples to general principles.

Page 35: Presentation1

SCIENTIFIC METHOD

The goal of scientific endeavor is to explain, predict and control phenomena. This goal is based upon the assumption that all behaviors and events are orderly and that they are effects which have discoverable causes.

Page 36: Presentation1

STEPS IN SCIENTIFIC METHOD

Recognition and definition of the problem
 i) Sensation
 ii) Conception
 iii) Perception
 iv) Observation
Formulation of hypothesis
Collection of data
 i) Observation
 ii) Interview
 iii) Questionnaire
Analysis of data
Conclusion

Page 37: Presentation1

CRITICAL THINKING

It involves the following steps:

Verbal reasoning
Argument analysis
Decision making
Critical analysis of prior research

Page 38: Presentation1

THE END