5. FIRE
• Future Internet Research and Experimentation
• Included as “FIRE” in EU FP7 and as “FIRE+” in EU H2020
• H2020 Future Internet:
• “moving towards a hyper connected world with hundreds of billions of devices fuelled
by ambient and pervasive services […]
• “supported by the early availability of testbeds for experiments and research
validation (FIRE+)”
• http://ec.europa.eu/programmes/horizon2020/en/h2020-section/future-internet
• A FIRE facility/testbed offers a number of resources (computing nodes,
tools, networking nodes, wireless spectrum analyzers, etc.) for experimenters
to use remotely
• http://www.ict-fire.eu
• https://www.youtube.com/watch?v=YlTSyn5iHCU
10. Fed4FIRE project
• Federation for FIRE
• FP7 IP project, 10/2012 - 9/2016
• project coordinated by iMinds
• Total budget: 7.75 MEUR
• www.fed4fire.eu
11. Fed4FIRE goal
• a common federation framework for FIRE facilities that will
• be widely adopted by different communities
• support powerful experiment lifecycle management
• support key aspects of trustworthiness
• http://doc.fed4fire.eu/
12. Example of the experiment lifecycle
• Resource discovery: “Show me all resources available in the Fed4FIRE federation.”
• Resource requirements: “Limit to nodes that have 2 IEEE 802.11n interfaces.”
• Resource reservation: “Reserve me 30 nodes on testbed X tomorrow from 9-17h.”
• Resource provisioning: “Make sure that they will be deployed with Ubuntu 12.04 LTS.”
• Experiment control: “After 10 s, start a data stream of 10 Mbps with source node 1; after 30 s, start a second data stream of 5 Mbps with source node 5.”
• Monitoring:
• Facility monitoring: crucial servers up and running? Testbed up and running?
• Infrastructure monitoring: CPU load, number of transmit errors
• Experiment measurement: measure end-to-end throughput, delay and jitter
• Permanent storage: “Store measurements on the storage server of testbed X for later analysis.”
• Resource release: “I’m done with them at 15h already; release my resources so they can be used by other experimenters.”
16. Cross platform
• Web-first approach allows publishing to virtually any platform:
• Modern web browser
• FORGEBox
• Any LMS supporting iframes
• Apple iBook
• EPUB3
23. “Using the iMinds testbeds during the
exercise(s) improved my learning experience.”
24. Open positive feedback
• “Not overly focussed on knowledge by heart, more on the
understanding and practical use of theory”
• “Everything was easy to execute, it was all about
understanding”
• “No configuration hassle, nice and easy graphs”
• “No struggling with configuring systems. Because of this, the
assistants had more time to extensively answer the questions
that we had.”
25. Deployment of the iMinds course
• e-Learning: an interactive Wi-Fi course
• One machine controlling the experiments
• Three wireless nodes performing the experiments
• Automatic provisioning
• Graceful degradation
• IPv4 to IPv6 gateway
• http://forge.test.iminds.be/wlan/
29. W-iLab.t Zwijnaarde
• “Isolated” environment
• external interference more limited than in an office environment
• no human presence
• 60 fixed node locations
• 15 mobile node carriers
• + data/power points for additional HW
31. Easy access
• Extensive documentation: http://doc.fed4fire.eu/
• Account for all testbeds via https://authority.ilabt.iminds.be
32. Easy access
• jFed tool for experiment design and access
• Transparent IPv6 access (no VPN, no tunnel)
• Transparent firewall work around
33. How to operate the testbed?
• Can be operated fully remotely
• Typical way of working:
1. reserve a slot for testing [ + indicate what hardware will be used ]
2. “swap in” test/experiment after reservation slot starts
• i.e. configure all nodes according to experiment description
3. during experiment, trigger certain manual or automated events (e.g.
“send traffic”, switch on/off node, change config. parameters…) and log
relevant data
4. after experiment ends, “swap out” experiment
34. How to operate the testbed?
• Reservation always needed for wireless testbed
• http://wilab2.ilabt.iminds.be/reservation/ (separate account needed)
• https://www.wilab2.ilabt.iminds.be:12369/reservation/sfareservation.php3
(import PKCS certificate into your browser)
39. OMF: Experiment description
• OEDL language (Ruby-based)
• http://mytestbed.net/projects/omf6/wiki
• Uniform description of experiment
• Which resources?
• Which applications?
• Entire flow of the experiment (timeline)
• Easy to map to other testbeds
40. OMF: Experiment Controller
• Executes an OMF OEDL Experiment Description
• omf exec wlan1-1.rb
• Testbeds can provide an EC
• Or you can run your own
• Fully automated execution
• Ties experiment to an OML server
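To make the shape of such an experiment description concrete, here is a rough OEDL sketch based on the general OMF 6 constructs (defProperty, defApplication, defGroup, onEvent). The node ID, binary path, and timings are invented for illustration; the actual wlan1-1.rb used in the course may look different.

```ruby
# Illustrative OEDL sketch (OMF 6). Node ID, binary path and
# timings are made up; this is not the course's actual wlan1-1.rb.
defProperty('sender', 'node1', 'ID of the sending node')

defApplication('iperf_client') do |app|
  app.binary_path = '/usr/bin/iperf'   # hypothetical path on the image
end

defGroup('Sender', property.sender) do |g|
  g.addApplication('iperf_client')
end

onEvent(:ALL_UP_AND_INSTALLED) do
  info 'All nodes are up; starting the flow'
  after 10 { group('Sender').startApplications }  # after 10 s, start traffic
  after 40 { Experiment.done }                    # end the experiment
end
```

Because the whole flow (resources, applications, timeline) lives in one file, the EC can replay it fully automatically with `omf exec`.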
42. OML: application hooks
• OML measurement points can be defined in custom source
code (C library)
• Some applications provided by NICTA: iperf, network test apps
• Custom wrapper scripts can be created to capture output of
any application, format it to OML compliant syntax
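Such a wrapper can be a few lines of script. The sketch below is hypothetical: it parses an iperf-style CSV report line and prints a measurement tuple in an OML-like shape; the exact OML text-protocol syntax differs, and the input format is only modeled on iperf's CSV output.

```ruby
# Hypothetical wrapper sketch: parse an iperf-style CSV report line
# and emit an OML-like measurement tuple (schematic, not the exact
# OML text-protocol syntax).
def parse_throughput(line)
  # e.g. "20150427,node1,5001,10.0-20.0,12500000" -> node, interval, bytes
  _date, node, _port, interval, bytes = line.split(',')
  lo, hi = interval.split('-').map(&:to_f)
  { node: node, mbps: (bytes.to_f * 8) / ((hi - lo) * 1_000_000) }
end

def to_oml_like(sample)
  "inject throughput #{sample[:node]} #{format('%.2f', sample[:mbps])}"
end

raw = '20150427,node1,5001,10.0-20.0,12500000'
puts to_oml_like(parse_throughput(raw))  # prints "inject throughput node1 10.00"
```

In a real deployment the wrapper would stream such tuples to the OML server instead of printing them.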
43. iMinds course flow
• Reserve 3 wireless nodes necessary for experimentation
• Provision machines
• 3 selected nodes on w-iLab.t (hostapd and wifi configured via scripts)
• 1 generic node on Virtual Wall with custom image (course page, OMF EC
and OML)
• Experiment control initiated through course page
• Start OMF Experiment Controller with correct Experiment Description
• Visualize OMF EC status on course page using status widget
• Results are automatically collected using OML
• Visualize OML results on course page using graph widget
47. Use of schema.org/LRMI
FORGE Project FP7-ICT-610889
• Included in FORGEBox implementation when preparing a course
• Google Structured Data Testing Tool
50. Opening the FORGE platform
• Build your own interactive course for free!
• Exploit FIRE facilities for educational purposes
• Deploy courses on FORGE platform or other educational platform
51. Target Groups - participants
• Educational institutions
• Research institutions
• Small/ medium/ large enterprises
• Anyone interested in bringing the worlds of FIRE and eLearning
together
52. Open Call details
• High visibility
• Guided training & technical assistance
• Simple application form
• Lean MoU
• Continuous open access to FORGE tools
53. Types of proposals
• Educator/Learner:
• Deploy an existing FORGE course (as-is)
• Transform a traditional course into an experiment-driven one
• Design a new FORGE lab course
• Extend an existing FORGE course
• Create interactive educational material (eBook)
• Developer:
• Develop new widgets and/or FIRE adapters
• Provider:
• Offer a testbed for remote experimentation
• + any combination of the above!
54. Timeline
[Timeline figure: 31st March 2015, 15th May 2015, Jun’15, by Sep’15, Oct’15, by Sep’16 (* and decision on consecutive phases)]
• Submission of proposals is possible after the cut-off date, but they will be taken into account for the next round of proposals’ submission and implementation.
55. How to apply
• Part A: profile, expertise, contact details
• Part B: description of work
• Submission: in English, to opencall@ict-forge.eu
• Very simple application form; submission in a single stage
56. Prioritization of proposals
• Criteria: reach/dissemination, feasibility, sustainability, interactive elements, use of FIRE facilities, foreseen support, deployment within a real-time course
• No scoring! Criteria are used for ranking proposals into a priority list
in terms of implementation and provided support.
Created/operated/maintained via research projects, EU funding
17 core partners
4-year duration
Resource discovery: Finding available resources across all facilities, and acquiring the necessary information to match required specifications.
Resource requirements: Specification of the resources required during the experiment, including compute, network, storage and software libraries.
Resource reservation: Allocation of a time slot in which exclusive access and control of particular resources is granted.
Resource provisioning
Direct (API): Instantiation of specific resources directly through the facility API, being the responsibility of the experimenter to select individual resources.
Orchestrated: Instantiation of resources through a functional component, which automatically chooses resources that best fit the experimenter’s requirements.
Experiment control: Control of resource behavior during experiment execution, involving actions to query and modify resource state, and their correct sequencing.
Monitoring
Facility monitoring: Instrumentation of resources to supervise the behavior and performance of facilities, allowing system administrators or first-level support operators to verify that facilities are performing correctly.
Infrastructure monitoring: Instrumentation of resources to collect data on the behavior and performance of services, technologies, and protocols to obtain measurements in the context of a concrete experiment.
Experiment measuring: Collection of experiment data generated by frameworks or services that the experimenter can deploy on their own.
Permanent storage: Storage of experiment related information beyond the experiment lifetime, such as experiment description, disk images and measurements.
Resource release: Release of experiment resources after deletion or expiration of the experiment.
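Put together, the stages above form one sequential flow: discover, filter by requirements, reserve, provision, and finally release. The Ruby sketch below only illustrates that ordering; FederationClient and all of its methods and data are invented for this example and are not a real Fed4FIRE API.

```ruby
# Hypothetical sketch of the experiment lifecycle ordering.
# FederationClient and its methods are illustrative only,
# not a real Fed4FIRE API.
class FederationClient
  def discover                         # Resource discovery
    [{ name: 'nodeA', wifi_ifaces: 2 }, { name: 'nodeB', wifi_ifaces: 1 }]
  end

  def matching(resources, min_ifaces)  # Resource requirements
    resources.select { |r| r[:wifi_ifaces] >= min_ifaces }
  end

  def reserve(resources, slot)         # Resource reservation
    { resources: resources, slot: slot }
  end

  def provision(lease, image)          # Resource provisioning
    lease[:resources].map { |r| "#{r[:name]}: #{image}" }
  end

  def release(lease)                   # Resource release (may happen early)
    lease[:resources].clear
  end
end

client = FederationClient.new
nodes  = client.matching(client.discover, 2)        # only nodeA has 2 interfaces
lease  = client.reserve(nodes, 'tomorrow 9-17h')
deployed = client.provision(lease, 'Ubuntu 12.04 LTS')
puts deployed                                       # prints "nodeA: Ubuntu 12.04 LTS"
client.release(lease)
```

Experiment control, monitoring, and permanent storage would run between provisioning and release; they are left out here to keep the ordering visible.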
In this example an experimenter has developed a mechanism to automatically create a Wi-Fi mesh network (multi-hop network). The experimenter wants to test this at a larger scale, hoping to prove that the new solution can easily forward multiple streams at the same time without sacrificing any performance.
As mentioned in the previous slide, there are 3 types of experimenter tools supported, and for each of them the experimenter has several options to choose from because we adopted 3 APIs in the Fed4FIRE architecture.
For resource discovery, reservation and provisioning, the experimenter can choose to use the Fed4FIRE portal, Flack, Omni and SFI. All these tools use the SFA interface to talk to the specific testbed management components (called Aggregate Managers).
For experiment control, the experimenter can run their own instance of an experiment controller. This can be the OMF6 Experiment Controller, or NEPI. The user can also log in on the nodes directly using their local SSH client, and perform experiment control actions manually using the console on the resource.
For measurement and monitoring, the experimenter can use several OML tools: filters, persistence tools (store in SQL database) and visualization tools. In case of experiment measurement, these tools will use OML streams that are directly originating from the OML measurement library component that is deployed on the resource. For facility and infrastructure monitoring, the testbed will wrap its existing monitoring infrastructure (being Nagios, Zabbix or Collectd) in an OML stream, which will be used by the OML experimenter tools.
To support all these tools, testbeds are required to expose their management software through the SFA interface, to deploy OMF6 experiment control on their testbed, and to provide one of the mentioned monitoring frameworks and wrap its output in OML streams. On the resources, three agents have to be deployed: an OMF resource controller, an OML measurement library and the appropriate monitoring agent that corresponds with the adopted monitoring framework.
Demo the course
Refer to the LMS for "barca vrouw"
Experiment on w-iLab.t for the actual wireless nodes; the web interface also runs as an experiment on another facility to make it all dynamic
Demo jfed tool
The problem is that there are different testbeds, each with its own management software.
Experiment description: a FULL description of the entire configuration + the FLOW of the experiment + measurement definitions -> easier to re-run experiments in a different environment, scientific relevance is demonstrated by including the entire description of the experiment, and results are easier to compare!
Through this open call the project opens up the FORGE platform to volunteers from the learning community and elsewhere, who want to exploit FIRE facilities for educational purposes and contribute to the FORGE ecosystem.
Within the call the project solicits proposals for the development (or use) of experiment-driven courses and offers the chance to educational institutions to include these innovative courses to their curricula.
FORGE is seeking contributions and new developments of educational material within the FORGE platform and tools.
The expected types of participants span among educational or research institutions, universities, individuals, small/medium/large enterprises, and in general anyone interested in bringing the worlds of FIRE and eLearning together.
All proposals shall be eligible and may be accepted, provided that they foresee the use of FORGE tools and FIRE facilities within the implementation of the suggested courses.
The FORGE project launches an Open Call as continuous open access to the FORGE tools, meaning that no strict procedures for proposal submission and evaluation will be applied. The call offers free access to the FORGE tools and processes.
The open call will apply a fast submission process based on a simple proposal template. In addition, the collaboration of successful participants with the FORGE project will be formalised with a lean Memorandum of Understanding (MoU); therefore the administrative burden for proposers is limited, since there will be no Grant Agreement, no Consortium Agreement and no official deliverables.
The FORGE consortium will provide support to the interested users throughout the development of their courses, covering guided training and technical assistance.
Moreover, the participants will be acknowledged as associate partners and will receive wide recognition and high visibility through the project’s website.
Participants from successful proposals within this call will not receive EC funding and will not become official partners in the FORGE project.
What types of proposals are expected? Is the call addressed only to developers?
Whether you are an educator, learner, developer or provider, many different types of proposals may be submitted for the FORGE Open Call, an indicative, non-exhaustive list of which is presented hereafter:
Deployment (as-is) of an existing FORGE course (either one of the prototype courses that have been developed by the FORGE consortium or an external course that will be developed within the Open Call process by an external user) in the context of a real-time course offered within the curriculum of the proposing institution(s);
Transformation of a traditional course to an experiment-driven course;
Design and development of a new lab course, following the FORGE methodology;
Further development and extension of a FORGE course;
Creation of interactive educational material (e.g. editing an existing course and creation of an eBook) based on the FORGE approach;
Development of new widgets and/or FIRE Adapters;
Offering of a testbed for remote experimentation;
Any combination of the above.
In order to be able to provide optimal support to the participants and facilitate the implementation of proposals, a cut-off date for submission of proposals is set after the announcement of the call, at 15th May 2015.
Depending on the interest expressed until then, the call may be extended into consecutive phases of submission and implementation of proposals. In that case, the stages of implementation of proposals and evaluation of work may be overlapping.
Participants will be able to submit their proposals at any time after the official announcement of the call. However, proposals submitted after the first cut-off date (or any cut-off dates that will be set for the consecutive submission phases) will be taken into account for the next round of proposals submission and implementation.
Following the open call announcement, all interested parties are welcome to submit their proposals for developing lessons and lab courses using FORGE tools together with FIRE facilities. Proposals will be submitted in a single stage, by completing a very simple application form that is available at the Open Call page of the project website (http://ict-forge.eu/opencall/).
The proposals must consist of two parts, Part A and Part B:
Part A provides the administrative information about the proposal and the applicants. The information requested includes characteristics of the applicants (organization profile and expertise) and contact details.
Part B is the description of the content of the proposed work. Applicants are advised to follow this structure when describing their proposals. The subsections included in this part intend to highlight those aspects that support the objective of the call and the goals of the project in general. Part B covers, among other things, the concept and objectives of the proposed work, the implementation details and the expected outcome and impact of the proposed work.
Proposals must be submitted in English, at the following address: opencall@ict-forge.eu
An acknowledgement of receipt will be emailed to the email address of the person identified as responsible in the submitted proposal as soon as possible after the FORGE consortium receives the proposal.
All proposals shall be eligible and may be accepted, provided that they foresee the use of FORGE tools and FIRE facilities within the implementation of the suggested courses.
However, some additional criteria will be also taken into account for ranking the proposals into a priority list in terms of implementation and provided support.
These criteria include:
the use of FIRE facilities,
the reach of the course and its dissemination aspect,
the feasibility and sustainability of the course,
the interactive elements it comprises,
as well as the foreseen provided support by the FORGE consortium.
In addition, proposals that foresee the deployment of the proposed course in the context of a real-time course offered within the curriculum of the proposers’ institutions will be given high priority.
However, no scoring will be applied, since the process is not competitive and the open call is not funded.
Support to participants prior submission:
The FORGE consortium will be available and will provide support to all interested parties for any discussion or clarification of any aspect of the call or their planned scenario of exploitation of the FORGE tools and processes throughout the duration of the call. This will cover guided training, both on administrative and technical issues, as well as technical assistance.
Potential proposers are strongly encouraged to discuss their ideas with the FORGE consortium prior to the submission of their proposal.
Support to participants during implementation:
The FORGE consortium will provide support to all participants and monitor the implementation of the lab courses. To that end, the consortium will provide the following to all interested parties participating in the open call for the development of their courses:
Online documentation and supporting material (such as videos with instructions on how to use the FORGE tools and platform and other technical issues). This documentation will be available at the Open Call page of the project website (http://ict-forge.eu/opencall/) throughout the duration of the call and may be updated if necessary, according to the received inquiries.
A permanent channel for submission of inquiries by e-mail, to the following address: opencall@ict-forge.eu.
A Frequently Asked Questions (FAQ) section at the Open Call page of the project website (http://ict-forge.eu/opencall/), where the received inquiries and several clarifications will be published.
New users and contributors will be supported by the FORGE consortium. All cases will be carefully examined and the FORGE consortium will provide the required technical assistance to the best of its ability, especially regarding implementation of new APIs, FIRE adapters, eLearning widgets or Learning Management System (LMS) integration.