ERASMUS PLUS 2015 SECTOR SKILLS ALLIANCES
AGREEMENT No. 2015 – 3212 / 001 – 001
PROJECT No. 562634-EPP-1-2015-IT-EPPKA2-SSA
Deliverable Number: 6.2
Title of Deliverable: Evaluation and Monitoring Plan
WP related to the Deliverable: 6
Dissemination Level (PU/PP/RE/CO)*: PU
Nature of the Deliverable (R/P/D/O)**: R
Actual Date of Delivery to the CEC: 16/06/2017
PARTNER responsible for the Deliverable: UVa-Education
WP starting month: M16
WP ending month: M19
Partner Contributor(s): SI4LIFE, UVa-Tech, UVa-Nurses, OMNIA, VE-II, AGE, GGALERY
Partner Reviewer(s): SI4LIFE, UVa-Tech, UVa-Nurses
*Dissemination Level: PU = Public; CO = Confidential, only for members of the Alliance (including Commission Services); PP = Restricted to external subjects in confidential mode (including Commission Services); RE = Restricted to a group specified by the Alliance (including Commission Services).
**Nature of Deliverables: R = Report; P = Prototype; D = Demonstrator; O = Other
1 ABSTRACT
This document illustrates the evaluation process established in the CARESS project, addressing two facets within the project: the evaluation of the framework and guidelines, as well as the evaluation of the CARESS Pilots and their responsiveness to the needs of the job market. Moreover, the document defines and integrates a list of criteria and indicators that will be used in T6.1 and in T6.3 to make the evaluation process more reliable. Throughout the document, we will present some examples of basic instruments that we have developed for the evaluation of the quality and the efficacy of the CARESS pilots in Finland, Italy and Spain (WP4). Pilots will be evaluated in their three main dimensions: blended learning, e-learning and non-formal/informal learning, through the use of the CARESS Virtual Community of Practice (VCP).
This document also presents an "evaluation toolkit" of basic instruments that have been developed for the pilots' evaluation in the three countries involved. These assessment tools can be used before, during and after the implementation of the aforementioned pilots, to collect valuable data, which will be helpful for formative and summative evaluation purposes. As the design documents for the Pilots have not yet been released, it must be taken into account that these evaluation instruments are provisional and may undergo some minor modifications to better fit the Pilots' design. The pilot designers can use the "evaluation toolkit" instruments as they are presented here, translate them, or develop similar tools based on these, according to their specific needs.
6 INTRODUCTION: STRUCTURE AND AIMS OF THE DOCUMENT
The CARESS Evaluation and Monitoring Plan is a document that describes how we will monitor and evaluate the CARESS pilot courses carried out within the CARESS project, and the criteria and indicators that should be used to evaluate the CARESS EU framework and guidelines. Afterwards, those criteria and indicators will be used to evaluate the responsiveness of the training provided within the CARESS project to the needs of the job market and to draw up a comprehensive final evaluation.
As we can see in Figure 1, whereas the ultimate purpose of Task 6.1 is to gather feedback from the pilots' designers to validate and prove the use of the CARESS EU framework and the guidelines delivered in T3.1 and T3.2, Task 6.2 is focused on the evaluation plan and the monitoring of the pilots and their functioning. Besides, we provide a description of the procedure that will be followed, as well as a set of indicators to assess the EU framework and the guidelines for the design of the pilots. Finally, a description of the instruments and guidelines that will be used to assess the design and the enactment of the pilots is provided, with the aim of assessing the responsiveness of the training developed to the job market needs according to the home-healthcare professionals' perspective. According to the aforementioned ideas, a comprehensive final evaluation of the pilots, linked with the responsiveness to job market needs (i.e., Task 6.3), will also be carried out following specific, measurable indicators defined in this document.
An important first step towards that aim is the description of the scope and the method of data collection and analysis that will be followed. The evaluation plan will clarify the procedure whereby the information gathered from the pilots' design has been used to assess the EU framework and the guidelines, as well as how the pilots' experiences have been developed in Finland, Italy and Spain. As a consequence, we want to know to what extent the activities and tasks developed in the national pilots are linked with the intended learning outcomes. Moreover, we want to illuminate whether the pilots were operating coherently with the programme's design. Additionally, the process evaluation, along with output and short-term outcome information, will help to clarify whether the pilots respond to the intended CARESS project objectives. It is worth noting that the evaluation topics have been defined taking into account the list of short-term results and long-term outcome indicators that were explicitly mentioned in the CARESS project proposal. These indicators are specific for each activity to be evaluated, and correspond to the different aims and goals regarding the functionalities of the element to be assessed. Some of the indicators can be tracked by compiling the data requested (e.g., that the learning materials are written in two languages, the national language (Italian, Finnish or Spanish) and English), while others may need the development of a particular tool. Nevertheless, all of them are linked with the quality standards for the whole project, connecting the evaluation with the assessment of the project quality. Thus, quantitative and qualitative indicators are described in section 7.3. This report is structured as follows.
Section 7 provides a general description of the context and the evaluation methodology, guided by the CSCL-EREM evaluation framework (Jorrín-Abellán, 2013), within the CARESS project. It also clarifies the scope of the evaluation, attending to the different scenarios and objectives, with a section devoted to the evaluation of the EU framework and guidelines and another to the evaluation of the national Pilots and the post-training responsiveness of the job market. Likewise, we show a general overview of the evaluation process within the CARESS project. Also in this section, we point out the evaluation objectives linked to criteria and indicators according to the status of the assessment tools and the time when the evaluation will be carried out.
Then, section 7.2 explains how the evaluation of the quality and the efficacy of the pilot courses in VET will be performed in Italy, Spain and Finland. On the one hand, we provide a description of the evaluation timeline and the evaluation instruments that are supposed to be used for gathering evidence during the design and the enactment of the pilot courses. On the other hand, we highlight the responsibilities and roles of CARESS partners with respect to the evaluation tasks that will be carried out.
Finally, section 11, the annexes, draws up a draft of the "evaluation toolkit": a set of assessment instruments and guidelines developed with the aim of helping CARESS pilots' designers to gather data to assess the usefulness of the EU framework and the guidelines to design VET pathways, as well as the pilots' quality, efficacy and responsiveness to the job market during and after training. It is worth noticing that the evaluation toolkit shows just a first draft of the assessment instruments (i.e., observations, questionnaires, interviews, etc.), which have the aim of guiding evaluators during the evaluation tasks. The ideas provided in the "evaluation toolkit" could undergo modifications to better adapt the instruments to the real needs of the context and the informants (i.e., teachers, students, VET experts) of the evaluation tasks performed.
7 EVALUATION CONTEXT AND METHODOLOGY
The evaluation context and scope are divided attending to the two different scenarios. First, the evaluation of the usefulness of the EU framework and guidelines (the Wizard tool) during the creation of the pilots' learning design, described in section 7.1. Secondly, the evaluation of the national pilots and the job market responsiveness after a period of six months post training, described in section 7.2. With the aim of clarifying the scope of the evaluation tasks within the CARESS project, section 7.3 will show a table with the assessment objectives linked with criteria, sub-criteria and indicators, according to the status of the assessment instruments as well as the time when the evaluation will take place.
Due to the inherent complexity of evaluation itself and the difficulty of healthcare scenarios carried out in different places across Europe, it is convenient to use a theoretical framework to guide the collection of evaluation data, its analysis and its interpretation. Specifically, the evaluation of the CARESS project follows the EREM model (Evaluand-oriented Responsive Evaluation Model) (Jorrín-Abellán, 2009). The EREM is a framework that relies on a responsive evaluation approach, empowering the idea of conducting evaluations focused on the phenomena to be evaluated (the evaluand) rather than on the field of expertise of the evaluators (Stake, 2004). It is worth noting that an evaluand can be defined as the entity, the process, the activity, or the product being evaluated. Besides, the EREM aims to be a framework to help in the evaluation of Computer Supported Collaborative Learning programs, innovations, learning and teaching strategies, and educational resources, as well as research projects. This evaluation method follows an interpretative research perspective that does not pursue statistically significant results or generalizations. Rather, it aims at a deeper understanding of the particular
phenomena under study. In our case, these are [1] the usefulness of the CARESS EU framework tool and the wizard to help teachers during the design of the pilots, and [2] the quality and efficacy of the learning courses carried out through the pilots performed in Italy, Spain and Finland across the CARESS project. Conducting an evaluation that is responsive to the needs of different healthcare practitioners will ensure that evaluative information is meaningful and more likely to be used for programme enhancement and improvement. According to the aforementioned ideas, we think that the EREM could be a useful framework which can help us in the creation of an evaluation design suitable to the particular necessities derived from the different contexts in which the pilots will take place. Furthermore, the responsive evaluation model has been implemented in various policy sectors, including the field of healthcare:
● The evaluation of the clinical skills assessment and training (CSAT) programme at the Faculty of Medicine (Curran, Christopher, Lemire, Collins & Barret, 2003).
In the context of the CARESS project, the evaluation has the aim of providing evidence about the quality and the efficacy of the pilots and their responsiveness to the needs of the job market, and about the usefulness of the CARESS EU tool in helping practitioners during the design of VET courses. Therefore, this evaluation has a threefold evaluand that includes: the quality and efficacy of the presence courses in homecare; the acceptability of the CARESS e-learning platform and Virtual Community of Practice; and the usefulness of the EU CARESS framework tool and its guidelines to support teachers during the design of learning pathways in VET scenarios.
Around the evaluand concept, the framework structures the evaluation design into three different facets: perspective (why the evaluand is evaluated), covering the main goal pursued and other significant open questions; ground (where the evaluand is evaluated), gathering the information about the contexts and the participants; and method (how the evaluand is evaluated), showing the data gathering techniques and the documents that support the conclusions. Figure 1 illustrates a general representation of the EREM framework, including the evaluand (in the core of the figure) and the different facets around it. This graphical representation has been obtained using the CSCL-EREM web tool1. The perspective facet is organized around the definition of Issues. An issue can be understood as a tension, a troubling choice. One issue naturally arises from the objectives of the present evaluation. It is stated as follows:
[Issue 1] How can the CARESS EU framework tool and the wizard help teachers during the design of the pilots?
[Issue 2] Can training courses based on competences and active methodologies be helpful to improve nurses'/OSS and practical nurses' clinical practice and transversal skills in elderly care?
• A European framework for vocational education and training (VET) in the field of homecare, that reflects the professions and occupations actually involved in domiciliary care, the competencies/skills required for each profession, and the related learning outcomes (referring to ECVET);
• Guidelines supporting different stakeholders working in the field of homecare in the use of the European framework itself for designing VET curricula which take into account contextual needs and already existing initiatives.
The Framework and Guidelines are delivered in WP3 and will evolve during the lifespan of the project. This procedure of progressive refinement requires scheduling different evaluation slots following the development of the results.
Details about the related indicators are provided in Table 3. Specific questionnaires and interviews to evaluate the efficacy and usability of the Framework, as well as of the Guidelines/Wizard, will be designed in T6.1, also including the results of the preliminary evaluation of the framework usability performed at M14 and the related questionnaire (D3.1.1).
The Wizard is a user-friendly interface allowing for a personalized "step-by-step" access to the functionalities of the web-based framework. In this way, different stakeholders will be supported in the access, use, and navigation of the Framework and in performing 3 main activities, as described in D3.2.1 by four Use Cases:
Case 0 and Case 1 have been implemented in their first release and will be evaluated following the timeline that will be provided in Task 6.1.
Case 2 will be implemented by the end of the second year of the project, in order to allow framework review/integration in the third year. Before its release, it will be tested as planned in T6.1. Feedback from users will be collected by M32 in order to possibly refine the features of Case 2 for the final release.
Specific guidelines for "CARESS Framework-supported" instructional design have been delivered in D3.3.2 in terms of "Design Patterns", i.e. "recurrent solutions" which could be used by VET instructional designers and teachers when approaching the challenge of designing a Compensative Learning Module to fill a specific "competence gap" that affects a certain HHCP (see D3.3.2 for details).
1. Spanish Pilot context description
To achieve all of the competencies provided, the Degree in Nursing requires the completion of 240 ECTS credits (each ECTS, by agreement of the Governing Council of the UVa, is equivalent to 25 hours), which will be distributed as follows (distribution in accordance with the guidelines of RD 1393/2007):
For 2016/2017 there are 57 universities in Spain with Faculties of Health Science, Faculties of Nursing, Faculties of Medicine or University Schools offering a Degree in Nursing, with 4,750 new openings being offered. Nevertheless, when considering specifically the homecare assistance of the elderly, several skill gaps have been found. These skill gaps are an obstacle and make the involvement of nurses in this question difficult. These skill gaps are:
● The need for support to operate technological devices for home health monitoring.
● The need to support users in the self-management of their mental health.
● Maintaining social and family relations.
● Solving daily problems or knowing where to seek help.
● Significant improvement in coordination capacity to plan actions.
● Meetings with health teams, and keeping in touch with other professionals who also deal with
3. Finland Pilot context description
The Finland pilot will target practical nurses (EQF 4). They must command value competence, occupational ethics, tolerance and problem-solving skills in the field, in addition to nursing and care skills; a skills gap in these transversal competences has been detected in Finland.
In accordance with the project, the implementation of the training courses involves three different spaces or apprenticeship areas:
Blended learning
Each VET pathway designed will be implemented in a blended learning situation in Italy, Spain, and Finland. Table 2 illustrates the location in which the pilots will be carried out, the target, as well as the partners involved.
Presence learning should be based on different methodologies (such as education in the workplace environment, school/work turnover, etc.) and strategies (i.e., case studies, critical incidents, problem-based learning, etc.).
E-learning
Specific learning activities will be set up in the e-learning course, fostering learning with different methodologies (i.e., case study, critical incidents, role playing, etc.). Learning activities will be supported by the presence of both technical and didactic tutors: an email ticketing system will be implemented for technical questions, while educational issues will be coordinated and managed with the support of qualified experts from each educational setting. To support e-learning delivery, an e-learning platform will be developed with multilingual open contents.
Informal/non-formal learning: CARESS Virtual Community of Practice
The CARESS Virtual Community of Practice Platform will be integrated with the e-learning platform developed, so as to facilitate the transition of students from formal to informal/non-formal learning activities.
The goal of the evaluation is to support the CARESS research process, by knowing how the design of the pilots has been done, how the pilots' training has been carried out and whether this training programme serves the purpose of the CARESS project. Thus, CARESS pilots will be conducted with the aim of overcoming the gaps between the job market and the learning programmes for different healthcare practitioners. The courses will be evaluated attending to their learning design, specific goals, learning outcomes, the quality of materials and activities, as well as teachers' and students' satisfaction. To be evaluated, each national pilot will have a reference person to collect the information and manage evaluation issues. This person is intended to be located in each training centre. The main key issues on which the evaluation will be focused are those described in the CARESS project. The aim of the evaluation process is to collect data (from a qualitative and quantitative point of view) to analyse those factors and contribute to the purpose of the project.
7.3 Evaluation tasks within the CARESS project: general overview
In order to better contextualise the evaluation without losing sight of its objectives, we have drawn up a table summarising the aims, criteria and sub-criteria, and their link with indicators and evaluation instruments. Moreover, we provide some insight into the development status of the different evaluation instruments within the CARESS project. Thus, one can see how some instruments are shown as drafts in the annexes section of this document, while others are still under development. The different phases of the evaluation process will take into consideration the different components of the CARESS project. The partners will work to evaluate
after them, monitoring their efficacy and sustainment after the launch phase.
As described in D3.6, the VCP design process includes several steps. To promote the sustainability of the CARESS VCP after the launch period, the users will be involved in new roles and responsibilities. A user of the VCP could be involved assuming different roles, including VCP Member and VCP Senior Member (a VCP Member promoted by the Moderator or the Technical Administrator to be moderator of a specific group within the community). During the launch phase of the VCP, which will take place during the pilots, the students involved will be enrolled in the VCP as VCP Members. After the end of the pilots, to make the VCP self-sustainable, the students will be invited to assume the role of VCP Senior Members and to stimulate the involvement and the engagement of the VCP Members. The success indicators are also designed to detect the efficacy of the VCP in being self-sustainable through the involvement of the VCP Members in the coordination of the work of the subject.
The indicators marked in bold (see Table 3) are those collected from the short- and long-term CARESS outcome indicators. This chart is intended to be a guide to the different elements to be taken into account within the evaluation tasks in WP6.
Tracked data: the uploaded contents (documents or other multimedia content); the use of accessed resources (e.g., watching a given video); the uploading of contents (a given document); the active participation in certain communities or virtual meetings (posting messages, joining a teleconference).
Indicators: number of practitioners involved in the VCP; level of activity, i.e. level of interactions among members; connections with professional associations; connections with final and end users. See possible indicators in Table 4.
Instrument: analysis of data log files that will be tracked automatically by the system (Annexes 7 and 10; draft in 6.2).
When: at the end of the pilots (M34).

Criterion: Members' satisfaction.
Sub-criterion: the effectiveness of the VCP through member satisfaction.
Indicators: the meeting of the community's initial objectives; the added value provided to the target as a group, in terms of new/enhanced organization, tools and opportunities derived from belonging to the VCP; the platform allows for an effective management of interactions and networks both at a national and at an international level; the platform allows for an effective management of inter-professional exchanges and contacts.
When: at the end of the pilots (M34).

Criterion: VCP platform usability.
Sub-criterion: positive feedback from users about platform usability.
Indicator: the functionalities of the platform allow for implementing all the activities envisaged by the instructional design documents.
Instrument: questionnaire/interview (to be developed).
When: during/at the end of the pilots' enactment and six months after training.

Criterion: Effectiveness of competence recognition tools.
Sub-criterion: effectiveness of the diary/portfolio.
Indicators: achievement level of the learning outcomes defined according to ECVET; the benefits provided to individuals, such as learning/job/networking occasions derived from belonging to the VCP.
Instrument: Annexes 11 and 12 (draft in 6.2).
When: during/at the end of the pilots' enactment and six months after training.

Sub-criterion: effectiveness of tracking/self-reporting.
Indicator: achievement level of the learning outcomes defined according to ECVET.
Instrument: analysis of data log files that will be tracked automatically by the system (to be developed).
When: during/at the end of the pilots' enactment and six months after training.
8 CONTEXT AND EVALUATION METHOD IN T6.3
With the aim of assessing the evaluand, three studies in authentic scenarios will be carried out from May 2017 until February 2018. These studies were formalised using the EREM evaluation model (see section 7) and involve real end users, such as educators and students.
We aim at a deeper understanding of the particular phenomena under study, in our case, the design and the enactment of three national pilots in VET to foster the skills and competences in elderly homecare of students of nursing, OSS and practical nursing. Besides, this evaluation proposal will show how teachers support learning activities in face-to-face, online and non-formal and informal courses, and will allow us to detect the benefits and limitations in order to assess the quality and the efficacy of the enacted pilots.
The aforementioned Issue 2 (Can training courses based on competences and active methodologies be helpful to improve nurses'/OSS and practical nurses' clinical practice and transversal skills in elderly care?) will be explored through informative questions grouped into several topics, both defined by us at the start of the evaluation. Emergent topics will also be taken into account while gathering and analysing data, in a "progressive in-focus approach" (Stake, 2010). These topics can be illustrated by the following:
Topic 1: Instructional design of the VET courses: to evaluate the design of the courses, we aim to analyse the instructional design documents for formal, non-formal and informal learning according to the national VET pathways targeting Social Health Operators, practical nurses and registered nurses. Thus, in this particular way, we need to know whether the contents are well suited to the courses' objectives. Also, we will gather information about the instructional strategies to be used, as well as to what extent the design of the courses across countries is going to follow the EQAVET indicators. Finally, we will collect information about students' impressions of the design of the courses, their learning expectations and whether the design and the contents of the courses are well suited to the needs of the job market.
Topic 2: Participation and students' performance: we intend to gather information regarding the enactment of the pilots.
We will ask students about the course contents, activities and methodologies employed, as well as their perceptions of the experience and satisfaction with the course. Furthermore, we will focus on understanding whether the courses performed have contributed in some way to overcoming their learning gaps. We will also ask about the course's utility in covering the needs perceived in their training/job. On the other hand, we will gather data about the teachers' perceptions of the pilots carried out.
Topic 3: Teachers' competencies and skills, where we intend to put emphasis on evaluating specific issues such as the adequate knowledge of the subject shown by teachers/tutors, the clarity of the explanations, their level of preparation of a topic, their interactions with students regarding questions and issues raised during the training, etc.
Topic 4: Quality and usefulness of training materials: the quality and usefulness of the materials that will be used in the training courses, from the participants' and the teachers' point of view in the learning experiences.
Topic 5: Non-formal and informal learning through the CARESS VCP: a virtual community in healthcare refers to a "group of people (and the social structure that they collectively create) that is founded on telecommunication with the purposes of collectively conducting activities related to health care and education" (Demiris, 2006, p. 179). Such activities can be supported by a platform aimed at allowing health professionals to discuss elderly care issues and problems, share documents, consult with experts and sustain relationships beyond face-to-face events.
The CARESS VCP platform will support non-formal and informal learning recognition according to three main types of evaluation tools and functionalities (see D3.6):
a) User tracking functionalities: we will gather information coming from the analysis of data log files that will be tracked automatically by the system. This way, we obtain data for evaluation purposes from the final users of the CARESS e-learning platform and from VCP users. It is worth noticing that the data could be exported in standard formats with the aim of making it accessible to the pilots' teachers, to help them in the identification of competence recognition. Potential tracking data could be the uploaded contents (documents or other multimedia content), the use of accessed resources (e.g., watching a given video) and the active participation in particular communities or virtual meetings (e.g., posting messages, joining a teleconference).
Table 4 illustrates some potential tracking data that could be obtained from the use of the e-learning platform as well as from the VCP. Furthermore, Annex 13 provides a detailed description of the tracking indicators that can be obtained from the CARESS e-learning platform.
Table 4. Tracking indicators per virtual setting.

Tracking indicator                                              VCP   e-learning platform
Uploaded contents (documents, multimedia, etc.)                  x
Accessing resources (number of visits, last accessed)            x            x
Posting messages                                                 x            x
Joining a teleconference                                         x
Activity submissions                                                          x
Nº of event subscriptions                                        x
Enrolled students                                                x            x
Badges achieved                                                  x
Create/join a group                                              x
Invite users to a group                                          x
Nº of searches                                                   x
Nº of invitations to join a group                                x
Nº of created diaries                                            x
Nº of uploaded contents (documents, multimedia, etc.)
  in the moderated group                                         x
Nº of users that joined a group                                  x
Nº of teleconferences coordinated in a group                     x
Nº of messages posted in a group                                 x
Nº of activities submitted in a group                            x
Nº of events subscribed to in a group                            x
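As a minimal illustration of how the user-tracking functionalities described above could feed these indicators, the following sketch aggregates a hypothetical exported log into per-setting event counts. The column names and event labels are assumptions for illustration only, not the actual CARESS export format:

```python
from collections import Counter
import csv
import io

# Hypothetical exported log: one row per tracked event.
# The columns (user, platform, event) and the event names are
# illustrative assumptions, not the real CARESS export schema.
SAMPLE_LOG = """user,platform,event
anna,vcp,post_message
anna,elearning,access_resource
ben,vcp,join_teleconference
ben,vcp,post_message
anna,vcp,create_group
"""

def tracking_indicators(log_text):
    """Count events per (platform, event) pair and collect active users."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    counts = Counter((row["platform"], row["event"]) for row in rows)
    active_users = {row["user"] for row in rows}
    return counts, active_users

counts, users = tracking_indicators(SAMPLE_LOG)
print(counts[("vcp", "post_message")])  # 2
print(len(users))                       # 2
```

Because the export is plain CSV, the same aggregation could be re-run at each evaluation moment (launch, midterm, end of pilots) to observe how the activity indicators evolve.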
b) VCP experience diary/portfolio: an organised collection of materials that presents and verifies skills and knowledge acquired through experience. According to Cedefop (2015), "It is of particular relevance for validating non-formal and informal learning because it allows the individual candidate to contribute actively to the collection of evidence and also offers a mix of approaches strengthening the overall validity of the process" (p. 61). A portfolio might include documents such as abstracts, performance appraisals, references from current and past employers, etc. In the context of this evaluation, an experience diary/portfolio will be used to allow students to describe their thoughts and experiences within the VCP (e.g., lessons learnt, discussions, created contacts, activity reflections, etc.). A proposal for this tool will be presented in D5.1.2, "Integration of VCP platform with e-learning platform". This tool will be used for evaluation purposes with the aim of collecting data from final users about their personal experiences related to elderly care in a non-formal and informal context, as well as to assess their competence acquisition.
The VCP experience diary will allow for a personal description of the user experience as a member of the VCP. The VCP member could periodically fill in a diary describing his/her own experience in the VCP (e.g., activities, discussions, lessons learnt, created contacts, etc.). The experience diary could be provided with questions or suggestions to encourage the textual description of the personal experience and the new knowledge achieved. The chronological versioning of the resulting diary should be tracked and presented in a format that allows it to be analysed for evaluation purposes. Besides, it is possible to create some templates in the VCP that may make it easier to describe the user's personal experience. Also, the software that we will use for the platform allows the issuing of "badges" to the participants when they carry out specific tasks in the platform. This can be used for generating an "activity report" of the participant, which can be complemented with information provided by the participant through their reflective diary, information that could be labelled to help further analysis.
In addition, categories let the user filter the diary entries. These categories could describe just personal experiences or could be aimed at competence recognition. The use of a blog-like activity (where the participants would provide links to, e.g., the artefacts created during their interaction with the VCP platform) could help to analyse a particular success indicator for the VCP.
c) VCP users' acceptance and usage: for evaluation purposes, we will use the Technology Acceptance Model (TAM) (Davis, 1989) to assess the users' perceived usefulness of the CARESS VCP as well as their perceived ease of use. The Technology Acceptance Model was developed to predict and explain information technology acceptance and usage. The main topics to be looked at according to this model are system quality, service quality, perceived usefulness, perceived ease of use and user satisfaction with the system.
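A TAM-style questionnaire is typically scored by averaging the Likert ratings of the items belonging to each construct. The sketch below shows this for two constructs; the item identifiers and the 1-5 scale are assumptions for illustration, not the actual CARESS instrument:

```python
from statistics import mean

# Illustrative TAM scoring (after Davis, 1989).
# Item names (pu1..pu3, peou1..peou2) and the 1-5 Likert scale
# are hypothetical placeholders, not the real questionnaire items.
PU_ITEMS = ["pu1", "pu2", "pu3"]    # perceived-usefulness items
PEOU_ITEMS = ["peou1", "peou2"]     # perceived-ease-of-use items

def tam_scores(response):
    """Average the Likert ratings (1-5) for each TAM construct."""
    return {
        "perceived_usefulness": mean(response[i] for i in PU_ITEMS),
        "perceived_ease_of_use": mean(response[i] for i in PEOU_ITEMS),
    }

scores = tam_scores({"pu1": 4, "pu2": 5, "pu3": 4, "peou1": 3, "peou2": 4})
print(scores["perceived_ease_of_use"])  # 3.5
```

Averaging per construct keeps the two dimensions separate, so a platform that is perceived as useful but hard to use (or vice versa) is visible in the results rather than hidden in a single overall score.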
Moreover, we will use the "Think Aloud Protocol (TAP)", a research strategy commonly used in psychological and educational research (Lewis, 1982; Charters, 2003). The TAP involves participants thinking aloud while they are performing a set of specified tasks. Participants are asked to say whatever comes into their mind as they complete the task. This might include what they are looking at, thinking, doing, and feeling. This gives observers insight into the participants' cognitive processes (rather than only their final product), making thought processes as explicit as possible during task performance. To provide evidence of the users' acceptance and usage of the CARESS VCP, the partners responsible for the evaluation issues will need to design a process which takes these concerns into account, by selecting a suitable task. According to the formal research protocol, all verbalizations will be transcribed and then analysed.
It is worth noticing that mixed methods will be used as a strategy for collecting information from different sources and at different moments. The final aim is to have a more complete picture of the Pilots' work and their strengths and weaknesses as an educational tool for achieving the CARESS project goal of overcoming the skill mismatch in healthcare professionals.
Figure 3. Connections between data gathering techniques, evaluation happenings and topics.
According to the aforementioned ideas, the EREM strongly encourages the use of mixed data gathering techniques and multiple informants in order to obtain different perspectives on the evaluand, thus enriching the evaluation process (Greene, Benjamin & Goodyear, 2001). Specifically, in this evaluation we
will combine quantitative and qualitative data coming from 4 types of informants and sources (teachers, students, the CARESS Virtual Community of Practice and the CARESS e-learning platform), using different data gathering techniques such as interviews, questionnaires, log file analysis, observations and reflective diaries, as we can see in Figure 3.
Then, the evidence collected from the different sources and informants will be analysed following a triangulation approach (Guba, 1981), in order to reach the global conclusions of this evaluation, increasing the trustworthiness of the findings. Likewise, we will perform an analysis of the content of the documents provided by the pilots' instructional designers, the transcriptions of the interviews, the reflective diaries, the observations of the pilots' enactment and the open questions coming from the questionnaires. A quantitative analysis will be performed using data coming from the Likert questionnaires as well as from the log files obtained from the CARESS VCP, as illustrated in Figure 4.
8.1 Description of the timeline and the evaluation instruments
Before we describe the evaluation methodology, we think it is important to go briefly over the evaluation phases and provide a general overview of the assessment instruments that will be used within the evaluation of the CARESS pilots in VET. Furthermore, in this section we identify a set of evaluation responsibilities that might be assumed by the CARESS partners with declared effort in WP6.
As illustrated in Figure 3, the CARESS evaluation of the VET pilots will be divided into two main moments. A first evaluation happening, called [H1]: Design, where we will look at the teachers' course designs, which will inform the design of the Pilots. This phase is prior to the pilots' launch. A second evaluation happening, called [H2]: Enactment, while the Pilots are being carried out. In turn, it is divided into three main moments:
a) The beginning of the Pilot. The evaluation team will collect a student questionnaire, a course observation and a teacher interview.
b) Midterm of the Pilot. During this period, a classroom observation will take place for each Pilot course. Also, a TAP (Think Aloud Protocol) questionnaire regarding the use of the e-learning platform and the VCP will be filled in by the evaluators.
c) End of the Pilot. We will collect the training materials, the teacher's final reflection diary, the students' and teachers' final questionnaires, and a final teacher interview. Additionally, we will collect a student diary regarding the use of the VCP.
Figure 5 illustrates a third evaluation happening, [H3]: Post training, in which we will collect data to evaluate the impact six months after the training. This way, we will conduct a student interview and a questionnaire with the aim of assessing the impact of the pilot training, considering the professionals' satisfaction, job improvements and career steps.
Throughout all those moments, the teachers' and students' log files will be collected to evaluate the e-learning module and the use of the VCP. Furthermore, statistics will also be derived from the online community through which teachers will be able to communicate and interact, in order to obtain a descriptive account of the participants' contributions.
Table 5 summarizes the data gathering techniques, informants, topics and times at which the information will be collected, according to Figures 3 and 4.
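Such a descriptive account of contributions can be derived from the platform log files with a simple count of actions per participant, as in the sketch below. The log layout (user, action, timestamp) is a hypothetical example; the actual CARESS platform export may use different field names.

```python
# Sketch: counting each participant's logged actions from a platform log
# export. The CSV layout below is a hypothetical example, not the actual
# CARESS log format.
import csv
import io
from collections import Counter

log = io.StringIO(
    "user,action,timestamp\n"
    "anna,forum_post,2017-05-02T10:01\n"
    "anna,resource_view,2017-05-02T10:05\n"
    "ben,forum_post,2017-05-02T11:12\n"
    "anna,forum_post,2017-05-03T09:30\n"
)

# Count how many logged actions each participant generated.
actions_per_user = Counter(row["user"] for row in csv.DictReader(log))
print(actions_per_user.most_common())  # [('anna', 3), ('ben', 1)]
```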
8.2 Who will collect the data?
There will be one person responsible for each in-person learning course (Vittorio Emanuele II, OMNIA and the Nurses School of UVa), in charge of collecting the required data using the evaluation instruments provided in this document, according to a time schedule that the evaluation coordinators will provide. The e-learning log files will be collected by GGallery, along with the VCP's feeding. One responsible partner from GGallery and one from SI4LIFE will be the reference contacts for the evaluation of the e-learning platform and the VCP.
Each person in charge of the evaluation in each country will be responsible for providing the data collected with the evaluation instruments, so that the evaluation coordinators can analyse them and write the report with the final results. Besides, to facilitate the analysis, we will ask the partners to send the required data in English.
Furthermore, members of AGE will make at least two observations of the pilots' enactment and will transfer the results to the coordinator of the evaluation work package.
All the instruments shown in the appendix are an initial proposal. Their primary structure will be maintained, although the instruments may be refined or adapted with the help of the involved partners' feedback, and once the characteristics of the contexts where the evaluation will take place are better known.
Table 6 illustrates the responsible partners according to the different evaluation tasks identified within this work package.
| Course / tool | Responsible person(s) | Tasks |
| --- | --- | --- |
| In-person learning courses | One person responsible for each in-person learning course: VE-II, OMNIA, Nurses School at UVa | Collaborate in the definition and refinement of the data gathering instruments available in the appendix of this report; translate the evaluation instruments into the corresponding languages (Spanish, Italian and Finnish); complete the evaluation toolkit; send the information in English to the coordinators of this WP. |
| In-person learning courses | One responsible person from AGE | Perform at least one observation of the enactment of each pilot; send the information in English to the coordinators of this WP. |
| In-person learning courses | One responsible person from OMNIA | Identify appropriate indicators to evaluate the training materials from the pilots according to the EQAVET framework. |
| e-learning courses | One responsible person from OMNIA | Identify appropriate indicators to evaluate the training materials from the pilots according to the EQAVET framework. |
| e-learning courses | One responsible person from GGallery and one from SI4LIFE | Identify useful indicators coming from the analytics obtained from the use of the e-learning platform; report the data coming from the e-learning platform analytics to the coordinator of this WP. |
| Virtual Community of Practice (VCP) | One responsible person from GGallery, one from UVa-Tech and one from SI4LIFE | Identify useful indicators coming from the VCP analytics to assess the final users' performance within this virtual space; report the data coming from the analysis performed to the coordinator of this WP. |
8.3 Draft of the evaluation toolkit in T6.3
The resources available for the evaluation will be the instructional designs and the training materials. Also, we will look at the final educational platform, where we expect to collect data about the educational and pedagogic process, along with the pilots' designs. The teachers' actions and reflections, and the students' actions, will provide data and material for evaluating meaningful inputs.
The main elements designed for the evaluation kit are:
● Instructional design of the VET courses. This topic will be informed mainly by the teacher's course design, the analysis of the training materials and the teacher's final reflections about the design and the enactment of the pilot. Some aspects of this topic may be enriched by data recorded for other topics, such as the students' participation and performance, the teacher's competences and skills, and the quality of the training materials.
We will gather data through student and teacher questionnaires at the beginning and the end of the enactment phase. These data will also be complemented by the reflections collected when analysing the quality of the training materials, and by the teachers' and students' log files regarding the use of the e-learning platform and the VCP. The courses' activities and knowledge acquisition will be evaluated through observations and teacher interviews, as well as through the teacher reflective diary.
● Teacher's competences and skills. This evaluation topic will be observed through course observations in the classroom, but also by analysing the students' participation and performance. Teacher interviews, the final student questionnaire, classroom observations and the teacher reflective diary will be the main data collection tools for this evaluation topic.
9 Quality Assurance and Overall Project Evaluation
According to the Riga Conclusions (22 June 2015), and as part of quality assurance systems, it is important to establish continuous information and feedback loops in VET systems based on learning outcomes. That would include the use of information on VET graduates' employability and a combination of data on learning, labour market entry and careers. Moreover, the outcomes of VET should be systematically used as part of quality assurance systems. As the Riga document states, in order to deliver vocational curricula it would be necessary to track learners after they complete their training, through the identification of adequate measures, so as to provide "feedback loops".
There have already been some experiences of tracking students (e.g. the development of the Graduate Tracking System in Croatia, 2013-2016: in 2013, the Croatian Ministry of Education began a pilot project with the aim of collecting existing data on graduate students and on employed and unemployed persons). This tracking could be done by combining administrative data with results that are sometimes easier to obtain, through targeted surveys, feedback sheets or course evaluations, face-to-face contact or focus group interviews. Tracking can be complemented with other measures such as interviews, focus group meetings, student feedback sheets, etc.
Tracking Learners' and Graduates' Progression Paths (2012) states: "Tracking should lead to the enhancement of teaching and learning, and be linked to student services. Relevant results would also be made available to staff and students, prospective students, policy makers and employers" (p. 51). Employment rates of graduates within a specified period after graduation, and student feedback on the quality of study programmes and facilities, could provide feedback to improve and confirm the quality of VET programmes. The information collected from the different tracking activities designed within the CARESS Virtual Community of Practice could be combined and used in this "feedback loop". That will allow us to systematically monitor the students' learning outcomes and their effectiveness.
This monitoring will take place during the early period after the conclusion of the Pilots, following the targeted students over a period of six months. Those students will be tracked using the e-learning platform and VCP data, together with student interviews and questionnaires. The subsequent analysis will allow us to provide the project with this feedback, contributing to the improvement of the quality of the homecare programme.
● Cedefop (2015). European guidelines for validating non-formal and informal learning. Available at http://www.cedefop.europa.eu/en/publications-and-resources/publications/4054. Last visited 2017/04/27.
● Charters, E. (2003). The use of think-aloud methods in qualitative research: An introduction to think-aloud methods. Brock Education Journal, 12(2), 68-82.
● Gaebel, M., Hauschildt, K., Mühleck, K., & Smidt, H. (2012). Tracking Learners' and Graduates' Progression Paths. TRACKIT. European University Association. Available at http://www.eua.be/Libraries/publications-homepage-list/EUA_Trackit_web.pdf?sfvrsn=2. Last visited 2017/05/10.
● Greene, J. C., Benjamin, L., & Goodyear, L. (2001). The merits of mixing methods in evaluation. Evaluation, 7(1), 25-44.
● Riga Conclusions (2015). On a new set of medium-term deliverables in the field of VET for the period 2015-2020, as a result of the review of short-term deliverables defined in the 2010 Bruges Communiqué. Available at http://ec.europa.eu/dgs/education_culture/repository/education/policy/vocational-policy/doc/2015-riga-conclusions_en.pdf. Last visited 2017/05/10.
● Widdershoven, G. A. M., & Sohl, C. (1999). Interpretation, action and communication: Four stories about a supported employment program. In T. A. Abma (Ed.), Telling tales: On narrative and evaluation. Advances in Program Evaluation (6) (pp. 109-130). Connecticut: JAI Press.
Instructions: This is an open-ended tool. The observer should take notes regarding the activity implemented during the lesson. Some of the areas on which the observer should focus are mentioned below. Feel free to modify these areas according to the progress of the lesson/activity.
(a) About the teacher
11.4 Annex 4: A questionnaire to evaluate the training materials by the students (Draft)
Student/Evaluator: _________________________________
Date: _____________________________________________
Instructions: Grade the items below on a scale from 1 to 5. To what degree do the training materials demonstrate the following? Please take into account the training materials regarding the in-person training, the e-learning and the Virtual Communities of Practice.
Q: To what extent do you agree with the following criteria regarding the product of the pilot course?
11.5 Annex 5: A questionnaire to evaluate the training materials by the teachers (Draft)
Teacher/Evaluator: __________________________________
Date: ______________________________________________
Instructions: Grade the items below on a scale from 1 to 5. To what degree do the training materials demonstrate the following? Please take into account the training materials regarding the in-person training, the e-learning and the Virtual Communities of Practice.
Q: To what extent do you agree with the following criteria regarding the product of the workshop?
11.11 Annex 11: Students' post-training questionnaire (Draft)
Instructions: Please complete this evaluation questionnaire by responding to the topics below. Your feedback will assist us in evaluating the quality of the CARESS pilot course. This information will be kept confidential and used only for the purpose of evaluating the CARESS pilot course.
Some examples of participation reports are Forum View (View Discussion, Search, Forum, Forums, Subscribers) and Forum Post (Add Discussion, Add Post, Delete Discussion, Delete Post, Move Discussion, Prune Post, Update Post).
An extremely useful feature of the participation report is the option to send a message to all students who have not completed a certain action. Other options available for the selected users are to add a note or to extend their enrolment.
• STATISTICS
If statistics are enabled for the site, graphs and tables of user activity may be generated. The statistics graphs and tables show how many hits there have been on various parts of the site at different times. They do not show the number of distinct users, though. They are processed daily, at the time you specify. You must enable statistics in order to see them.
The grader report collects the items that have been graded in the various assessed parts of Moodle and allows you to view and change them, as well as sort them into categories and calculate totals in various ways. When you add an assessed item to a Moodle course, the grade book automatically creates space for the grades it will produce, and also adds the grades themselves as they are generated, either by the system or by you.
From Moodle 2.4.2 onwards, the default number of students per page ("grade report students per page") can safely be increased without any data loss when grading large numbers of students.
The following additional grade settings can be found under Settings > Site administration > Grades > General settings. These settings affect all grade books used by all teachers.
If this is checked by default, then all custom scales will have a corresponding value assigned based on the number of options in the scale (e.g. a three-option scale will have the values 1, 2 and 3), and this value will be included in the grade aggregation.
Grade publishing may be enabled by checking the grade publishing box. If enabled, administrators are provided with grade export publishing settings in each course grade book.
• Recover grades default
When manually enrolling a student in a course, there is an option called 'Recover user's old grades if possible', but this checkbox is easy to miss. Enabling 'Recover grades default' results in 'Recover user's old grades if possible' being ticked for every course.
• Unlimited grades
Teachers can enter grades over 100% directly in the grade book if the unlimited grades setting is enabled.
Grades can be organised into grade categories. A grade category has its own aggregated grade, which is calculated from its grade items. There is no limit to the nesting of categories; that being said, each grade item may belong to only one category. Also, all grade items and categories belong to at least one permanent category: the course category. A grade calculation is a formula used to determine grades, based on other grade items. Note that this is not the same as the Calculated question type.
Calculations for the grade book follow the pattern of formulas/functions in popular spreadsheet programs. They start with an equals (=) sign and use common mathematical operators and functions to produce a single numerical output. This output is then used as the computed value for the grade item you are editing.
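As an illustration, a calculation that averages two grade items might look like the following. This sketch assumes the two items have been given the ID numbers `quiz1` and `assign1` (arbitrary names chosen here for the example); the exact calculation syntax should be checked against the documentation for your Moodle version.

```
=average([[quiz1]], [[assign1]])
```

The double square brackets reference grade items by their ID numbers, and the result becomes the computed value of the grade item being edited.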
Outcomes are specific descriptions of what a student has demonstrated and understood at the completion of an activity or course. Each outcome is rated on some sort of scale.
Basically, outcomes are similar to sub-components of a grade. A grade is an assessment of overall performance and may include tests, participation, attendance and projects. Outcomes assess specific levels of knowledge through a series of statements, which may be coded with numbers or letters. Thus, an overall grade can be given for a course, along with statements about specific competencies in the form of outcomes.
When a tutor realizes that a user is either not very active or on the verge of abandoning the course, he or she can take action by stimulating the user through a series of utilities on the platform itself. The Survey module is a course activity that provides a number of verified survey instruments, including COLLES (Constructivist On-Line Learning Environment Survey) and ATTLS (Attitudes to Thinking and Learning Survey), which have been found useful in assessing and stimulating the online learning process. Teachers can use these to gather data from their students that will help them learn about their class and reflect on their own teaching. Currently, Moodle only offers specific types of surveys (future versions will enable you to create your own).
The available surveys have been chosen as being particularly useful for evaluating online learning environments that use a constructivist pedagogy. They are useful for identifying certain trends that may be developing among the participants.
Underpinning this dynamic view of learning is a new epistemology: social constructivism, which portrays the learner as an active conceptualiser within a socially interactive learning environment. Social constructivism is an epistemology, or way of knowing, in which learners collaborate reflectively to co-construct new understandings, especially in the context of mutual inquiry grounded in their personal experience.
Essential to this collaboration is the development of the students' communicative competence, that is, the ability to engage in open and critical discourse with both the teacher and peers. This discourse is characterized by an empathic orientation towards building reciprocal understanding, and a critical attitude towards examining underlying assumptions.
COLLES has been designed to enable you to monitor the extent to which you are able to exploit the interactive capacity of the World Wide Web for engaging students in dynamic learning practices.
• ATTLS - Attitudes to Thinking and Learning Survey
The theory of 'ways of knowing', originally from the field of gender research (Belenky et al., 1986), provides us with a survey tool to examine the quality of discourse within a collaborative environment.
People with higher CK (connected knowing) scores tend to find learning more enjoyable, and are often more cooperative, congenial and willing to build on the ideas of others, while those with higher SK (separate knowing) scores tend to take a more critical and argumentative stance to learning.
Studies have shown that these two learning styles are independent of each other (Galotti et al., 1999; Galotti et al., 2001). Additionally, they are only a reflection of learning attitudes, not of learning capacities or intellectual power.
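As a sketch of how such subscale scores can be computed, the snippet below averages the Likert ratings (1-5) of the items belonging to each subscale. The odd/even item split used in the example is a placeholder assumption, not the official ATTLS mapping; the real assignment of the 20 items to the CK and SK subscales should be taken from the instrument itself.

```python
# Sketch: computing ATTLS connected-knowing (CK) and separate-knowing (SK)
# subscale scores as means of Likert ratings. The odd/even item-to-subscale
# mapping below is a placeholder assumption, not the official ATTLS mapping.

def attls_scores(responses, ck_items, sk_items):
    """Return (CK, SK) subscale means for one respondent.

    responses: dict mapping item number to a Likert rating (1-5).
    ck_items / sk_items: item numbers belonging to each subscale.
    """
    ck = sum(responses[i] for i in ck_items) / len(ck_items)
    sk = sum(responses[i] for i in sk_items) / len(sk_items)
    return ck, sk

# Hypothetical respondent: agrees with the (assumed) CK items, not the SK ones.
ck_items = list(range(1, 21, 2))   # assumption: odd items -> CK
sk_items = list(range(2, 21, 2))   # assumption: even items -> SK
responses = {i: 4 if i % 2 else 2 for i in range(1, 21)}
print(attls_scores(responses, ck_items, sk_items))  # (4.0, 2.0)
```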