I am a PhD candidate and Information Officer at the Immersive Education Initiative, European Chapter. My research focuses on intelligent learning environments, mixed reality and immersive education. This research is part of a collaborative project between the University of Essex and King Abdulaziz University to scale up existing intelligent-environment technology, from smaller facilities such as the University of Essex iSpace to a complete campus with multi-storey, large-footprint buildings.
A survey conducted by The Economist in 2008 gathered responses from 289 executives: 189 participants came from higher education and 100 from corporate settings.
A virtual learning environment (web / 3D world)
A physical learning environment (real world)
Example of blended reality: the use of videoconferencing (Skype). Example of the vacancy problem: everyday activities such as running.
But what happens with laboratory activities?
University of California Davis School of Medicine - Virtual Hallucinations
Project: http://www.ucdmc.ucdavis.edu/welcome/features/20070404_virtual_psych/index.html
Video (a bit disturbing): http://www.youtube.com/watch?v=s33Y5nI5Wbc
Sakshat Amrita Virtual Lab (Amrita University)
Project: http://amrita.vlab.co.in/
Video: http://www.youtube.com/watch?v=ViqHtlZSOjM
Ohio University College of Osteopathic Medicine - Virtual Haptic Back, for palpatory training
http://www.ohio.edu/people/williar4/html/VHB/VHB.html
University of Bristol - LabSkills
http://www.bristol.ac.uk/news/2009/6154.html
The software was developed by Bristol University and is currently used at Durham University and Nottingham University (each university gave a talk about the use of this software in their courses)
http://www.labskills.co.uk/
University of Leeds - Virtual Labs
http://www.virtual-labs.leeds.ac.uk/pres/index.php
Freie Universität Berlin - Technology Enhanced Textbook (TET)
http://didaktik.physik.fu-berlin.de/projekte/tet/
http://didaktik.physik.fu-berlin.de/IMPAL/show/demo.php
Durham University - Interactive screen experiments (physics)
http://level1.physics.dur.ac.uk/general/index.php
The interaction between the elements can be defined as Cross-Reality (xReality) or dual reality
The implementation of the InterReality Portal is based on three major components: a real environment, a 3D virtual environment, and xReality objects and virtual objects.
we combine: 1) a semi-spherical sectioned screen that includes a desk, allowing the user to sit in a natural position to perform learning activities with a free range of head movement and without any intrusive body instrumentation, based on a specification from the sci-fi story “Tales from a Pod”; 2) a camera that allows students to interact with each other and with the environment; 3) a network of sensors and actuators that obtains real-time information for object identification and replication in dual reality states.
Following the ideas of smart objects on the Internet-of-Things (IoT), an xReality object has a unique ID, a list of available services (e.g. to get data or receive data) and, in some cases, rules (e.g. a certain object cannot work without fulfilling some preconditions). In a similar way, each virtual object has a unique ID, one or more attached behaviours (e.g. the virtual object must behave as a solid object according to physical variables such as weight, gravity, etc.) and rules similar to those of the xReality object.
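As a minimal sketch of these descriptors (all class and field names here are illustrative, not taken from the actual implementation), the two object kinds might be modelled like this:

```python
from dataclasses import dataclass, field

@dataclass
class XRealityObject:
    """A physical smart object: unique ID, offered services, preconditions."""
    uid: str
    services: dict = field(default_factory=dict)   # name -> callable (get/set data)
    rules: list = field(default_factory=list)      # preconditions that must hold

    def can_operate(self, state: dict) -> bool:
        # The object refuses to work unless every precondition is fulfilled.
        return all(rule(state) for rule in self.rules)

@dataclass
class VirtualObject:
    """A purely virtual counterpart: unique ID, attached behaviours, rules."""
    uid: str
    behaviours: dict = field(default_factory=dict)  # e.g. physics: weight, gravity
    rules: list = field(default_factory=list)

# Hypothetical example: a temperature-sensor board and its virtual twin.
sensor = XRealityObject(
    uid="buzzboard-temp-01",
    services={"get_temperature": lambda: 21.5},
    rules=[lambda state: state.get("powered", False)],
)
twin = VirtualObject(uid="virtual-temp-01",
                     behaviours={"solid": True, "weight_kg": 0.05})

print(sensor.can_operate({"powered": True}))   # True: precondition fulfilled
```

The point of the sketch is the symmetry: both kinds of object carry an ID and rules, so agents can treat them uniformly, while only the xReality object exposes services backed by real hardware.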
formed by a group of interchangeable, pluggable components, creating a mashup in which a main component identifies and integrates the others
As long as the session continues, changes in any of the objects will be managed by the Context-Awareness agent and the Mixed Reality agent considering the following scenarios: A change in any Virtual object of a given InterReality Portal results in identical changes to all subscribing InterReality portals. A change in an xReality object of a given InterReality Portal results in changes in the representation of the real device on all subscribing InterReality portals.
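The two propagation scenarios above can be sketched as a simple publish/subscribe loop between portals. The class and portal names here are hypothetical placeholders; in the actual system this role is played by the Context-Awareness and Mixed Reality agents:

```python
class Portal:
    """A minimal InterReality Portal that mirrors object state to its peers."""
    def __init__(self, name):
        self.name = name
        self.state = {}        # object_id -> last known state
        self.subscribers = []  # other portals mirroring this one

    def subscribe(self, other: "Portal"):
        self.subscribers.append(other)

    def local_change(self, object_id: str, new_state: dict):
        # A change in a local object (virtual or xReality)...
        self.state[object_id] = new_state
        # ...is pushed to every subscribing portal, which updates its
        # own representation of that object.
        for portal in self.subscribers:
            portal.remote_update(object_id, new_state)

    def remote_update(self, object_id: str, new_state: dict):
        self.state[object_id] = new_state

# Hypothetical session between two geographically dispersed portals.
essex = Portal("Essex")
jeddah = Portal("Jeddah")
essex.subscribe(jeddah)

essex.local_change("buzzboard-led-01", {"on": True})
print(jeddah.state["buzzboard-led-01"])   # {'on': True}
```

Both scenarios reduce to the same mechanism: only what is mirrored differs (a virtual object is replicated identically, while an xReality object is mirrored only as a representation of the real device).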
Learners' view: during the learning session, as they interact with the environment and with each other, they are programming a series of actions to be executed by the objects. Instructors' view: while creating a UoL, they are establishing a sequence of activities to be performed by the learners during the educational session.
is to build computer science projects combining hardware (xReality objects) and software modules (virtual objects) to create Internet-of-Things applications, emphasising computing fundamentals grounded in co-creative and collaborative interaction between learners using problem-based learning (PBL). A diversity of resources may be available to support learning; however, this process cannot be complete without… IMS = Instructional Management Systems (IMS) Global Learning Consortium. Benefits: portability and reusability. Problem-based learning is a constructionist method that allows students to construct their own knowledge through the correlation between concepts and proposed solutions to real-world problems, performed in realistic settings.
From the instructors' perspective: the creation of a UoL is based on activities (atomic functions) which can be combined to create nuclear functions (a complete UoL). From the learners' perspective: xReality and virtual objects (atomic functions) can be considered part of a deconstructed set of components that students can reconstruct in any combination to generate their own unique xReality project, or to reconstruct an xReality project prescribed by the instructor (nuclear functions). From a technical (system architecture) viewpoint: the deconstructed elements become sets of autonomous networked resources that may be inter-connected in different combinations (constructed), forming a variety of student projects or UoLs as required by the teacher or student.
Our research is moving from phase 1 to phase 2, integrating the InterReality Portal implementation with the end-user programming concepts discussed in this paper. Our final goal is to combine virtual and xReality objects with deconstructionism, using end-user programming to create mixed reality learning activities that can be constructed and shared by teams of geographically dispersed students.
We presented a series of challenges for constructing this model
Towards blended reality on collaborative laboratory activities using smart objects
Towards blended reality on collaborative laboratory activities using smart objects
School of Computer Science & Electronic Engineering
University of Essex, UK
Anasol Peña-Ríos
acpena@essex.ac.uk
Overview of the presentation
• Introduction
• Blended learning landscapes
• Scenario: Collaborative laboratory activities
• Interreality environment (conceptual model & implementation)
• Blended reality distributed system
• Collaboration projects
• Summary
Introduction
The Economist Intelligence Unit, “The future of higher education: How technology will shape learning,” The Economist, 2008.
Use of technology in academic settings (some challenges)
• Promote the adoption of new technologies by academic faculty members and students
• Adapt technologies, pedagogical methods and evaluations, enabling integrated solutions for people in geographically dispersed locations
• Change the way we experience education, from one-dimensional (physical) to multi-dimensional (physical and virtual) education in an integrated way
→ Blended learning landscapes
Blended learning landscapes
Virtual Learning Environment ↔ Blended Learning Environment ↔ Real Learning Environment
• An infrastructure that works as a link between these environments, able to reflect information (data, changes, etc.) in real time
• Learning activities that complement the virtual and real worlds, making it possible for students and tutors to move between the two in a seamless way
Blended reality
Users interact in real time with two different environments (real/local and virtual/distant), extending them to work as if they were one by blending traces of one into the other unconsciously (often seemingly simultaneously).
This might be limited by the capacity of the user's presence and engagement to a single reality at a time (Lifton's "vacancy problem"1), a consequence of the user's real immersion.
1. J. Lifton and J. Paradiso, “Dual Reality: Merging the Real and Virtual,” Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol. 33, no. 1, pp. 12-28, 2010
Scenario (work in progress): Collaborative laboratory activities
Laboratory activities for distance learners
What are the options for remote laboratory activities?
• Simulations
• Virtual laboratories
• Remote laboratories
Laboratory activities serve:
• To test conceptual knowledge
• To work collaboratively
• To interact with equipment
• To perform analysis on experimental data
Examples
• University of Bristol - LabSkills
• University of Leeds - Virtual Labs
• Freie Universität Berlin - Technology Enhanced Textbook
Challenges for current technology:
• Limited interaction with real equipment
• Use of idealised datasets
• Restricted collaborative interaction
Interreality environment
Interreality: “a hybrid total experience between reality and virtuality”1
A collection of interrelated devices in the real world, a virtual environment and software agents that allow users to complete activities at any point of Milgram's Virtuality Continuum, achieving the user's perception of integration between the physical and the virtual world.
1. J. van Kokswijk, Hum@n: Telecoms and Internet as Interface to Interreality: a Search for Adaptive Technology and Defining Users, Bergboek, 2003
Interreality Portal (definition)
A human-computer interface (HCI) which captures data obtained in real time and processes this data so it can be mirrored in the virtual world, linking both worlds.
In the educational context it can be used:
• As a learning environment
• For geographically dispersed students
• Using mixed reality laboratory activities
Interreality Portal (implementation)
• A "real" immersive environment: the ImmersaStation1, 2 gives learners the sensation of "being there"
• A 3D virtual environment: a 3D GUI made in Unity3D, a cross-platform game engine used to create interactive 3D content
• xReality objects
1. Immersive Display Group (www.immersivedisplay.co.uk)
2. V. Callaghan, «Tales From a Pod», Creative Science Workshop, Nottingham, 2010.
Smart objects
“Autonomous physical/digital objects augmented with sensing, processing, and network capabilities”, which can interpret their local situation and status, and can communicate with other smart objects and interact with human users1
xReality objects = smart objects that are:
• coupled to their virtual representation (dual reality state)
• updated and maintained in real time
1. G. Kortuem, F. Kawsar, D. Fitton and V. Sundramoorthy, “Smart objects as building blocks for the Internet of Things,” IEEE Internet Computing, vol. 14, no. 1, pp. 44-51, 2010
Cross-Reality (xReality)
Interaction between physical and virtual elements within an environment.
• Each object (real and virtual) is complete by itself and can exist without the other.
• The dual reality state enriches both objects through a bi-directional process that can reflect, influence and merge real-time information.
MIT Media Laboratory – Dual Reality Lab, 2008
xReality objects & virtual objects
xReality objects:
• Real-world objects that have a virtual representation
• Dual reality state
Virtual objects:
• Only exist inside a virtual environment
• Have rules and behaviours, but these depend on the existence of the virtual world
A. Peña-Ríos, V. Callaghan, M. Gardner & M. J. Alhaddad, «Remote mixed reality collaborative laboratory activities: Learning activities within the InterReality Portal», Web Intelligence and Intelligent Agent Technology (WI-IAT), 2012 IEEE/WIC/ACM International Conferences - The Intelligent Campus International Symposium (IC'12), vol. 3, pp. 362-366, Macau, China, 2012.
xReality objects (implementation)
Components:
• Fortito's* Buzz-Board Educational Toolkit: 30 pluggable boards that can interconnect and communicate with each other via the I2C protocol; comprises different sensors and actuators to allow the creation of diverse physical mashups
Main module:
• Raspberry Pi**: detects other components and works as a hub to connect them to the interreality system
* Fortito (www.fortito.mx)
** Raspberry Pi Foundation (www.raspberrypi.org)
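Since the boards communicate over I2C, one plausible way for the hub to detect pluggable components is to scan the bus and map responding addresses to known board types. The sketch below simulates this idea: the address table, board names and `FakeI2CBus` class are invented placeholders, not the real BuzzBoard address map or hardware interface:

```python
# Hypothetical address map: which I2C address corresponds to which board type.
KNOWN_BOARDS = {
    0x20: "display",
    0x21: "temperature-sensor",
    0x22: "motor-driver",
}

class FakeI2CBus:
    """Stand-in for a real I2C bus (e.g. smbus on a Raspberry Pi)."""
    def __init__(self, present_addresses):
        self.present = set(present_addresses)

    def probe(self, address: int) -> bool:
        # On real hardware this would attempt a read and catch the I/O error.
        return address in self.present

def scan_components(bus, address_range=range(0x08, 0x78)):
    """Return {address: board_type} for every recognised board on the bus."""
    found = {}
    for addr in address_range:
        if bus.probe(addr) and addr in KNOWN_BOARDS:
            found[addr] = KNOWN_BOARDS[addr]
    return found

# Simulate a mashup with two boards plugged in.
bus = FakeI2CBus(present_addresses=[0x20, 0x22])
print(scan_components(bus))   # {32: 'display', 34: 'motor-driver'}
```

On a real Raspberry Pi the probe step would go through the kernel's i2c-dev interface; the address range 0x08–0x77 is the standard usable 7-bit I2C range.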
Implementation (architecture, single dual reality)
A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «Developing xReality objects for mixed-reality environments», 1st Workshop on Cloud of Things (CoT'13), Athens, Greece, 2013. Unpublished conference paper.
Implementation (single dual reality)
A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «Developing xReality objects for mixed-reality environments», 1st Workshop on Cloud of Things (CoT'13), Athens, Greece, 2013. Unpublished conference paper.
Multiple dual realities
• Management of multiple dual reality states
• Addition of a collaborative layer to allow two or more learners in different geographical locations to participate in the laboratory activity
A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «End-user programming & deconstrutionalism for collaborative mixed reality laboratory co-creative activities», 2nd European Immersive Education Summit (EiED'12), Paris, France, 2012.
Blended reality distributed system (conceptual model)
A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «Developing xReality objects for mixed-reality environments», 1st Workshop on Cloud of Things (CoT'13), Athens, Greece, 2013. Unpublished conference paper.
End-user programming
The interaction between users within collaborative mixed reality learning activities can be analysed from two different angles:
• From the learners' view
• From the instructors' view
The learning environment should allow users to create and execute learning activities regardless of their computing expertise.
End-user programming in the 3D learning environment (learner's view, instructor's view)
• Homogeneity & structure in the creation of mixed reality learning activities
• Adding collaborative learning to the use of xReality objects
Learning activities
Learning activities within the InterReality Portal are structured as a sequence of activities based on the IMS Learning Design specification (Units of Learning, UoL).
Diagram: computer science projects = xReality objects (HW) + virtual objects (SW) → Internet-of-Things applications, emphasising computing fundamentals through problem-based learning and collaborative interaction between learners.
Deconstructed model
• Based on the disaggregation of physical/logical devices and services.
• Proposes the creation/identification of a number of elementary services (atomic functions) which can be combined in various ways to create complex functions (nuclear functions).
Deconstructionism

Role                     | Atomic function                                                                        | Nuclear function
Learner                  | Objects available in the environment, actions available (programming statements)       | An Internet-of-Things project
Instructor               | Resources available in the environment, activities available (sequence of activities)  | A Unit of Learning (UoL)
Technical infrastructure | Processes, threads, processors or FPGA                                                 | xReality toolkit and system

A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «End-user programming & deconstrutionalism for collaborative mixed reality laboratory co-creative activities», 2nd European Immersive Education Summit (EiED'12), Paris, France, 2012.
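One way to picture the table: atomic functions are small callable units, and a nuclear function is simply an ordered composition of them. The sketch below illustrates this under invented names (the functions and the sensor value are placeholders, not part of the real toolkit):

```python
# Atomic functions: the elementary actions a learner can pick from.
def read_sensor(ctx):
    ctx["reading"] = 21.5          # placeholder sensor value
    return ctx

def threshold(ctx):
    ctx["alarm"] = ctx["reading"] > 20.0
    return ctx

def drive_actuator(ctx):
    ctx["led_on"] = ctx["alarm"]   # turn an LED on when the alarm fires
    return ctx

def compose(*atomic_functions):
    """Build a nuclear function (a complete IoT project / UoL) from atoms."""
    def nuclear(ctx):
        for step in atomic_functions:
            ctx = step(ctx)
        return ctx
    return nuclear

# The learner reconstructs the atoms into their own unique project:
project = compose(read_sensor, threshold, drive_actuator)
print(project({}))   # {'reading': 21.5, 'alarm': True, 'led_on': True}
```

The same composition mechanism serves each row of the table: a learner composes objects and actions into an IoT project, while an instructor composes activities into a UoL.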
Implementation phases
A. Peña-Rios, V. Callaghan, M. Gardner & M. J. Alhaddad, «Towards the Next Generation of Learning Environments: An InterReality Learning Portal and Model», 8th International Conference on Intelligent Environments 2012 (IE'12), Guanajuato, Mexico, 2012.
Scale Up project
• A collaboration between King Abdulaziz University, KSA, and the University of Essex, UK.
Summary
xReality & virtual objects:
• Core elements of our mixed reality activities
• Combined using end-user programming to create an educational mixed reality object
InterReality Portal:
• An immersive mixed reality learning environment
• Grounded in collaborative, constructionist, problem-based learning theories
• Offers the possibility of creating collaborative laboratory activities for geographically dispersed students
Challenges
Pedagogical challenge: a constructionist, student-centred method to create laboratory activities for distance learners (learners, instructors).
Technical challenges:
• A representation of a deconstructed world
• A set of components that can be shared and combined
• Distribution of objects between different immersive environments
→ Blended reality distributed system
Challenges (2)
A solution for distributed mixed reality laboratories must combine:
• A physical model of distributed xReality objects in an immersive learning environment
• A pedagogical model of constructionist laboratory activities
To solve this we have proposed the use of a deconstructionist architecture.
Questions
Thank you for your attention!
email@example.com
@ieee.org