Page 1: Crowdsourcing Predictors of Behavioral Outcomes

Crowdsourcing Predictors of Behavioral Outcomes

Presented by

Alekya Yermal (Lead) (10NJ1A0502)

I. Aiysha Sham (10NJ1A0514)

S.M.V.N. Sowndarya (10NJ1A0537)

V. Padmavathi (10NJ1A0542)

Under the guidance of Mrs. D. Usha Rajeswari

Page 2: Crowdsourcing Predictors of Behavioral Outcomes

CROWDSOURCING PREDICTORS OF BEHAVIORAL OUTCOMES

Page 3: Crowdsourcing Predictors of Behavioral Outcomes

Retrieving large amounts of data from the crowd

Page 4: Crowdsourcing Predictors of Behavioral Outcomes

Introduction


• Generating models from large data sets, and determining which subsets of data to mine, is becoming increasingly automated.

• This was accomplished by building a Web platform in which human groups interact both to respond to questions likely to help predict a behavioral outcome and to pose new questions to their peers.

• Result: a dynamically growing online survey. This behavior also leads to models that can predict each user's outcome based on their responses to the user-generated survey questions.

• Examples: 1) Predict users' monthly electric energy consumption. 2) Predict users' body mass index (BMI).

Page 5: Crowdsourcing Predictors of Behavioral Outcomes

Existing System

• Statistical tools such as multiple regression or neural networks provide mature methods for computing model parameters once the set of predictive covariates and the model structure are pre-specified (see the sketch after this list).

• The task of choosing which potentially predictive variables to study is largely a qualitative task that requires substantial domain expertise.

• Examples: 1) A survey designer must have domain expertise to choose questions that will identify predictive covariates.

2) An engineer must develop substantial familiarity with a design in order to determine which variables can be systematically adjusted to optimize performance.
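To make the contrast concrete, here is a minimal, illustrative sketch (not code from the paper) of the kind of computation such mature tools automate: fitting a simple least-squares line once a single covariate has already been chosen by a domain expert. The class name and data are assumptions for illustration only.

    public class SimpleRegression {

        /** Fits y = intercept + slope * x by ordinary least squares for one pre-specified covariate x. */
        public static double[] fit(double[] x, double[] y) {
            int n = x.length;
            double meanX = 0, meanY = 0;
            for (int i = 0; i < n; i++) { meanX += x[i]; meanY += y[i]; }
            meanX /= n;
            meanY /= n;
            double num = 0, den = 0;
            for (int i = 0; i < n; i++) {
                num += (x[i] - meanX) * (y[i] - meanY);
                den += (x[i] - meanX) * (x[i] - meanX);
            }
            double slope = num / den;
            double intercept = meanY - slope * meanX;
            return new double[] { intercept, slope };
        }

        public static void main(String[] args) {
            // Made-up data: hours of air conditioning per day vs. monthly energy use (kWh).
            double[] hours = { 1, 2, 4, 6, 8 };
            double[] kwh = { 320, 410, 560, 700, 860 };
            double[] model = fit(hours, kwh);
            System.out.printf("kWh ~ %.1f + %.1f * hours%n", model[0], model[1]);
        }
    }

The point of the contrast: estimating the parameters is routine once the covariate is fixed; the hard, unautomated step is deciding which covariates to collect in the first place.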

Page 6: Crowdsourcing Predictors of Behavioral Outcomes

Disadvantages of existing system

• There are many problems in which one seeks to develop predictive models to map between a set of predictor variables and an outcome.

• One aspect of the scientific method that has not yet yielded to automation is the selection of variables for which data should be collected to evaluate hypotheses.

• In the case of a prediction problem, machine science is not yet able to select the independent variables that might predict an outcome of interest, and for which data collection is required.

Page 7: Crowdsourcing Predictors of Behavioral Outcomes

Proposed System


• The goal of this research is to test an alternative approach to modeling, in which the wisdom of crowds is harnessed both to propose which potentially predictive variables to study, by asking questions, and to respond to those questions, in order to develop a predictive model.

• This paper introduces, for the first time, a method by which non-domain experts can be motivated to formulate independent variables as well as populate enough of these variables for successful modeling.

• Users arrive at a Web site in which a behavioral outcome is to be modeled. Users provide their own outcome and then answer questions that may be predictive of that outcome.

• Periodically, models are constructed against the growing data set that predict each user’s behavioral outcome. Users may also pose their own questions that, when answered by other users, become new independent variables in the modeling process.

Page 8: Crowdsourcing Predictors of Behavioral Outcomes

Advantages of proposed system

• Participants successfully uncovered at least one statistically significant predictor of the outcome variable.

• For the BMI outcome, the participants successfully formulated many of the correlates known to predict BMI and provided sufficiently honest values for those correlates to become predictive during the experiment.

• While our instantiations focus on energy and BMI, the proposed method is general and might, as the method improves, be useful for answering many difficult questions regarding why some outcomes differ from others.

Page 9: Crowdsourcing Predictors of Behavioral Outcomes

SOFTWARE CONFIGURATION

Operating System: Windows 7, 8
Coding Language: Java/JSP
Front-End Design: HTML, CSS, JavaScript
Database: MySQL
Database Connectivity: JDBC
Server: Apache Tomcat v7
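As a minimal sketch of how this stack fits together (the database name, credentials, and class name below are assumptions, not taken from the project), a JDBC connection to MySQL from the Java side might look like this:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class SurveyDb {

        // Illustrative connection details; a real deployment would externalize these.
        private static final String URL = "jdbc:mysql://localhost:3306/crowd_survey";
        private static final String USER = "survey_app";
        private static final String PASSWORD = "change_me";

        public static Connection open() throws Exception {
            // JDBC 4+ drivers such as MySQL Connector/J register themselves automatically.
            return DriverManager.getConnection(URL, USER, PASSWORD);
        }

        public static void main(String[] args) throws Exception {
            try (Connection conn = open();
                 PreparedStatement ps = conn.prepareStatement("SELECT 1");
                 ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    System.out.println("Connected to MySQL via JDBC: " + rs.getInt(1));
                }
            }
        }
    }

JSP pages served by Tomcat would obtain connections in the same way, ideally through a connection pool rather than opening one per request.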

Page 10: Crowdsourcing Predictors of Behavioral Outcomes

Hardware requirements


Processor: Pentium IV
Speed: 1.1 GHz
RAM: 256 MB (minimum)
Hard Disk: 20 GB
Keyboard: Standard Windows keyboard
Monitor: SVGA

Page 11: Crowdsourcing Predictors of Behavioral Outcomes

Main Modules:-

1. Investigator Behavior

2. User Behavior

3. Model Behavior

Page 12: Crowdsourcing Predictors of Behavioral Outcomes

Investigator Behavior

The investigator is responsible for initially creating the web platform and seeding it with a starting question. Then, as the experiment runs, the investigator filters new survey questions generated by the users.

Once posed, each question was filtered by the investigator as to its suitability. A question was deemed unsuitable if any of the following conditions were met (a minimal sketch of such a filter follows this list):

(1) the question revealed the identity of its author (e.g. “Hi, I am John Doe. I would like to know if...”), thereby contravening the Institutional Review Board approval for these experiments;

(2) the question contained profanity or hateful text;

(3) the question was inappropriately correlated with the outcome (e.g. “What is your BMI?”).
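The paper does not give the filter's implementation; the following is only a sketch of how the three checks above could be coded in the project's Java stack. The class name and keyword lists are illustrative assumptions, not part of the original system.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Locale;

    public class QuestionFilter {

        // Illustrative keyword lists (assumptions); a real deployment would maintain these separately.
        private static final List<String> IDENTITY_HINTS = Arrays.asList("my name is", "i am ");
        private static final List<String> PROFANITY = Arrays.asList("badword1", "badword2");
        private static final List<String> OUTCOME_LEAKS = Arrays.asList("what is your bmi", "monthly electric bill");

        /** Returns true if the question may be published to the other users. */
        public static boolean isSuitable(String question) {
            String q = question.toLowerCase(Locale.ROOT);
            boolean revealsIdentity = containsAny(q, IDENTITY_HINTS);   // condition (1)
            boolean hasProfanity = containsAny(q, PROFANITY);           // condition (2)
            boolean leaksOutcome = containsAny(q, OUTCOME_LEAKS);       // condition (3)
            return !(revealsIdentity || hasProfanity || leaksOutcome);
        }

        private static boolean containsAny(String text, List<String> phrases) {
            for (String p : phrases) {
                if (text.contains(p)) {
                    return true;
                }
            }
            return false;
        }
    }

In the experiments described by the paper this judgment was made by the human investigator; an automated keyword check like the one above could at most pre-screen questions for that review.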

Page 13: Crowdsourcing Predictors of Behavioral Outcomes

User Behavior

Users who visit the site first provide their individual value for the outcome of interest. Users may then respond to questions found on the site.

Their answers are stored in a common data set and made available to the modeling engine.

At any time a user may elect to pose a question of their own devising. Users could pose questions that required a yes/no response, a five-level Likert rating, or a number. Users were not constrained in what kinds of questions to pose.
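As a hedged illustration of how an answer could be written to the common data set over JDBC, assuming a hypothetical responses(user_id, question_id, answer_value) table (the schema and class name are assumptions, not from the paper):

    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class ResponseDao {

        /**
         * Persists one answer so the modeling engine can later use it as training data.
         * Assumes a table: responses(user_id, question_id, answer_value).
         */
        public void saveResponse(Connection conn, int userId, int questionId, String answerValue) throws Exception {
            String sql = "INSERT INTO responses (user_id, question_id, answer_value) VALUES (?, ?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, userId);
                ps.setInt(2, questionId);
                ps.setString(3, answerValue); // yes/no, a 1-5 Likert rating, or a number, stored as text here
                ps.executeUpdate();
            }
        }
    }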

Page 14: Crowdsourcing Predictors of Behavioral Outcomes

Model Behavior

• The modeling engine continually generates predictive models using the survey questions as candidate predictors of the outcome and users’ responses as the training data.
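The actual modeling engine is not reproduced here. As one simple, illustrative way to treat survey questions as candidate predictors, the sketch below scores a question by the Pearson correlation between users' numeric answers and their outcome values; the class name and toy data are assumptions.

    public class PredictorRanking {

        /** Pearson correlation between one question's numeric answers and the users' outcome values. */
        public static double pearson(double[] answers, double[] outcomes) {
            int n = answers.length;
            double meanX = 0, meanY = 0;
            for (int i = 0; i < n; i++) { meanX += answers[i]; meanY += outcomes[i]; }
            meanX /= n;
            meanY /= n;
            double cov = 0, varX = 0, varY = 0;
            for (int i = 0; i < n; i++) {
                double dx = answers[i] - meanX;
                double dy = outcomes[i] - meanY;
                cov += dx * dy;
                varX += dx * dx;
                varY += dy * dy;
            }
            return cov / Math.sqrt(varX * varY);
        }

        public static void main(String[] args) {
            // Toy data: five users' Likert answers to one question vs. their BMI values.
            double[] answers = { 1, 2, 3, 4, 5 };
            double[] outcomes = { 19.5, 21.0, 24.2, 27.8, 31.1 };
            System.out.printf("correlation with outcome: %.3f%n", pearson(answers, outcomes));
        }
    }

Running such a score over every question as new responses arrive gives one crude way to rank candidate predictors; the system described in the paper rebuilds full predictive models periodically rather than relying on a single correlation.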

Page 15: Crowdsourcing Predictors of Behavioral Outcomes

Architecture

Page 16: Crowdsourcing Predictors of Behavioral Outcomes

System Design

Page 17: Crowdsourcing Predictors of Behavioral Outcomes

Admin

[Flow chart: Admin Login, then a check for an unauthorized user; Yes leads to End Process, No allows Take Survey, Add Question, and Discard Question before End Process.]

Page 18: Crowdsourcing Predictors of Behavioral Outcomes

User

[Flow chart: UserB Login, then a check for an unauthorized user; Yes leads to End Process, No allows Pose Question, Play Quiz, Poll, and Rate It before End Process.]

Page 19: Crowdsourcing Predictors of Behavioral Outcomes

Use Case Diagram

[Use case diagram: the Admin actor performs Take Survey, Add Question, and Discard Question; the User actor performs Pose Question, Vote for Poll, Rate It, and Play Quiz.]

Page 20: Crowdsourcing Predictors of Behavioral Outcomes

Class Diagram

[Class diagram: a Login class with user name and password attributes and checkValid()/invalid() operations; a UserA Login class with Take Survey, Add Question, and Discard Question operations (UserA process()); and a UserB Login class with Pose Question, Play Quiz, Poll, and Rate It operations (UserB process()).]

Page 21: Crowdsourcing Predictors of Behavioral Outcomes

Activity Diagram

[Activity diagram: from Start, UserA Login leads to Take Survey, Add Question, and Discard Question, while UserB Login leads to Pose Question, Play Quiz, Poll, and Rate It; both branches finish at End Process.]

Page 22: Crowdsourcing Predictors of Behavioral Outcomes

Sequence Diagram

[Sequence diagram: Admin and UserB exchange Take Survey, Pose Question, Play Quiz, Add Question, Poll, Rate It, and Discard Question messages with the System and Database.]

Page 23: Crowdsourcing Predictors of Behavioral Outcomes

Sample screens of implementation

Page 24: Crowdsourcing Predictors of Behavioral Outcomes

• Creating user profile

Page 25: Crowdsourcing Predictors of Behavioral Outcomes

• User login page

Page 26: Crowdsourcing Predictors of Behavioral Outcomes

• Home page

Page 27: Crowdsourcing Predictors of Behavioral Outcomes

User creates quiz question

Page 28: Crowdsourcing Predictors of Behavioral Outcomes

User creates poll question

Page 29: Crowdsourcing Predictors of Behavioral Outcomes

Select quiz

Page 30: Crowdsourcing Predictors of Behavioral Outcomes

Add question to quiz

Page 31: Crowdsourcing Predictors of Behavioral Outcomes

User playing quiz

Page 32: Crowdsourcing Predictors of Behavioral Outcomes

User selects poll

Page 33: Crowdsourcing Predictors of Behavioral Outcomes

Poll question-user

Page 34: Crowdsourcing Predictors of Behavioral Outcomes

User submits poll question

Page 35: Crowdsourcing Predictors of Behavioral Outcomes

Rate quiz

Page 36: Crowdsourcing Predictors of Behavioral Outcomes

Rating

Page 37: Crowdsourcing Predictors of Behavioral Outcomes

Admin login

Page 38: Crowdsourcing Predictors of Behavioral Outcomes

Add question - poll

Page 39: Crowdsourcing Predictors of Behavioral Outcomes

Conclusion

This paper introduced a new approach to social science modeling in which the participants themselves are motivated to uncover the correlates of some human behavior outcome, such as homeowner electricity usage or body mass index.

In both cases participants successfully uncovered at least one statistically significant predictor of the outcome variable.

The main goal of this paper is to demonstrate a system that enables non-domain experts to collectively formulate many of the known (and possibly unknown) predictors of a behavioral outcome, and to show that this system is independent of the outcome of interest.