Document Reference: D2.10
Dissemination Level: PU
Version: 1.11
Date: 15/07/2019
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 693319.
Disclaimer: This document reflects the views of the authors only and the Research Executive Agency (REA) is not responsible for any use that may be made of the information it contains.
D2.10: Demonstrator applications testing and deployment
Project acronym: Mobile-Age
Project full title: Mobile-Age
Grant agreement no.: 693319
Responsible: AUTH
List of Authors:
Michail Papamichail, Anastasios Kakouris, Manolis Falelakis (AUTH), Freddy Priyatna (UPM), Peter Shaw (SCC), Mikael Snaprud (TT), Frank Berker(FTB)
List of figures
Figure 1 - Mobile Age Work Package structure and interrelationships
Figure 2 - Relationship with other deliverables
Figure 3 - Overview of the Mobile Age users and stakeholders in co-creation of open data-based public services
Figure 4 - MADE users and end users
Figure 5 - CIDER Users
Figure 6 - The conceptual architecture of the Mobile Age Deployment Platform (MADE)
Figure 7 - Overview of the CIDER architecture
Figure 8 - Data model for co-creation activities
Figure 9 - High-level architecture diagram
Figure 10 - Architecture Overview Diagram
Figure 11: Successful Sign Up
Figure 12: Unsuccessful Sign Up
Figure 13: Unsuccessful Sign In
Figure 14: Successful Sign In
Figure 15: Unsuccessful Application Creation
Figure 16: Successful Application Creation
Figure 17: Development Environment Dashboard
Figure 18: Error in creation of Development Environment
Figure 19: Git Local Repository
Figure 20: Successful Synchronization
Figure 21: Error in Synchronization
Figure 22: Successful installation of Dependencies
Figure 23: Error in Dependencies installation
Figure 24: Successful Deployment of application
Figure 25: Error in application’s deployment
Figure 26: Successful termination of application
Figure 27: Error in the termination of application
Figure 28: Successful start of application's container
Figure 29: Unsuccessful start of application's container
Figure 30: Successful stop of application's container
Figure 31: Error in termination of application's container
Figure 32: Login page with valid credentials
Figure 33: Dashboard after successful login with valid credentials
Figure 34: Result if invalid details are entered
Figure 35: Request made using ‘Postman’ showing the form data, and JSON response from the server
Figure 36: Request made using ‘Postman’ showing the form data, and JSON response from the server but with no authentication
Figure 37: Analytics Dashboard - Showing the reports for a particular TID of a registered/logged in user
Figure 38: Dashboard landing page for login or manual TID entry
Figure 39: Analytics Dashboard - Showing the reports for a particular TID entered manually
Figure 40: JSON result after an authenticated user requesting their TID
Figure 41: Response to POST with a valid event data
Figure 42: Response to POST with invalid event data (includes which fields were invalid)
Executive summary
This document reports on work carried out within the context of Mobile-Age Work Package 2 (WP2), which aimed at the design and implementation of the Open Senior Citizen Public Service Engagement Platform (OSCPSEP), which comprises two parts: (i) The Mobile Age Deployment Environment (MADE) and (ii) the Co-creation Information Documentation EnviRonment (CIDER).
Although, as its title suggests, the initial purpose of this deliverable was to report on the testing of the demonstrator applications, its content has shifted slightly to focus on testing the platform components, while still including functionality tests of the applications. This is mainly because testing constitutes a very important part of the development of the platform, is specified in the Grant Agreement of the project and, in our judgement, this deliverable is the most appropriate place for reporting it. Moreover, the demonstrator application testing was reported in detail in deliverable D3.6 - Evaluation Report (a) Bremen (b) South Lakeland (c) Zaragoza (d) Central Macedonia, where the reader can find an evaluation from multiple perspectives: functionality, accessibility and usability.
Consequently, we report on the tests carried out for MADE (including both its PaaS and SaaS components), CIDER and the functionality tests of the demonstrator applications. The deliverable contains brief descriptions of all these components and presents a test reporting template, before reporting the actual test results using this template.
1 Introduction
1.1 Placement and objectives
Mobile Age aims to explore and implement innovative ways to support senior citizens in accessing and using public services through personalized mobile technologies based on open government data. In pursuit of this goal, the Open Senior Citizen Public Services Engagement Platform (OSCPSEP) has been developed to provide innovative tools and services that facilitate the exploitation of open data and services targeting senior citizens all across Europe. In particular, the OSCPSEP aims to provide the infrastructure and the corresponding tools necessary for the development of mobile and web-based applications that target senior citizens, with functionality that relies on the use of open data and services. This functionality is supported by the MADE (Mobile Age Deployment Environment) component of the OSCPSEP. In addition, OSCPSEP aims to encourage the publication of open datasets by providing open data providers with an appropriate environment and tools for publishing their datasets so that they are widely accessible and thus easily exploitable. Finally, and in response to the reviewers’ feedback, we have developed the CIDER (Co-creation Information Documentation EnviRonment) component to host the data produced in the co-creation processes and to facilitate those processes by offering flexible ways of retrieving and combining the data. CIDER is now integrated with the co-creation guidebook, providing an interactive environment that aims to facilitate co-creation projects in all of their stages, i.e. planning, carrying out and learning/reflecting from them.
Work Package 2 relates to the design and development process of all the OSCPSEP components. The objectives of Work Package 2 are as follows:
• Identify generic user-oriented and system-related, functional, as well as non-functional requirements for the implementation of the system modules.
• Provide technical specification and design the overall architecture of the OSCPSEP infrastructure.
• Define the guidelines for the integration of modules.
• Stress test and integrate all modules developed.
• Implement an early OSCPSEP platform release and develop the final OSCPSEP platform.
• Test and deploy the pilot case applications.
The goal of this document is to present the technical tests carried out to ensure proper functionality of MADE, CIDER and the demonstrator applications.
1.2 Scope and relationship with other tasks
As illustrated in Figure 1, Work Package 2 consists of five tasks and is devoted to the development of the Open Senior Citizen Public Service Engagement Platform. The OSCPSEP provides the cloud-based technical foundation for the development of 3rd party mobile and web-based public services that target senior citizens. It involves Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) capabilities and its overall goals are summarized below:
• Provide developers (public service providers and third parties) the environment (runtime, database management, preconfigured services and development tools) to develop, deploy and run applications by means of a PaaS offering.
D2.10 – Demonstrator applications testing and deployment
11 | P a g e
• Minimize the integration burden for developers wishing to enhance existing digital web and mobile public services with OSCPSEP functionality. For that purpose, the platform supports a series of generic components (e.g. Unified Search, Behaviour Analytics) exposed through a collection of REST-based APIs as SaaS.
• Provide support for storing and publishing a wide range of datasets including information about integrated public services, senior citizen profiles, behaviour and service interaction analytics.
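To make the SaaS style of integration concrete, the sketch below shows how a client might construct a request to a search component of this kind. The base URL, endpoint path and parameter names are purely illustrative assumptions and not the actual OSCPSEP API:

```javascript
// Sketch of addressing a SaaS component such as the Unified Search service.
// Base URL, path and parameter names are hypothetical, for illustration only.
function buildSearchRequest(baseUrl, query, options = {}) {
  const url = new URL("/api/v1/search", baseUrl); // hypothetical route
  url.searchParams.set("q", query);
  if (options.limit) url.searchParams.set("limit", String(options.limit));
  if (options.dataset) url.searchParams.set("dataset", options.dataset);
  return url.toString();
}

// The resulting URL can then be fetched with any HTTP client:
const request = buildSearchRequest("https://oscpsep.example.org", "bus stops", {
  limit: 10,
  dataset: "transport",
});
console.log(request);
```

Because each component is addressed through ordinary URLs in this way, an existing web or mobile service can adopt a single component without taking on the rest of the platform.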
Figure 1 - Mobile Age Work Package structure and interrelationships
This deliverable is a result of work performed mainly within Task 2.6 Overall System integration and technical testing of demonstrator applications and has strong links with a number of other documents, such as:
• D2.12 – Update to PaaS Infrastructure, which contains a detailed technical description of the PaaS components of MADE as well as CIDER.
• D2.11 – Update to D2.8 Final OSCPSEP release, which is the software release that complements D2.12.
• D1.8 – Final Guidebook on the design and deployment of co-creation approaches. This is now integrated into the CIDER platform.
• D3.2 to D3.5 – Senior Citizen Engagement Reports for Bremen, South Lakeland, Zaragoza and Central Macedonia respectively, will contain data that is stored in the CIDER platform component, which will also be used to generate statistics and analytics for this series of documents.
• D4.2 to D4.5 – Prototype Demonstrator applications at Bremen, South Lakeland, Zaragoza and Central Macedonia respectively, will present the applications built and deployed using the MADE platform component for each field site.
• D3.6 - Evaluation Report (a) Bremen (b) South Lakeland (c) Zaragoza (d) Central Macedonia. This contains a detailed evaluation of the demonstrator applications, and some of its content is also reported here for the sake of completeness.
Figure 2 provides a diagrammatic representation of the links of D2.10 to other deliverables in the project.
1.5 Document Structure
The remainder of the document follows the unified structure adopted for all WP2 deliverables in order to address the concerns of the reviewers. Chapter 2 provides a description of the Mobile Age ecosystem, categorising its users and stakeholders. Chapter 3 briefly describes the design of the OSCPSEP components and the demonstrator applications; at the end of Chapter 3 we present the template used to report the testing results for each module. Chapter 4 presents these results, which include statistics and graphs from the testing of MADE, CIDER and the demonstrator applications. Finally, Chapter 5 concludes the document.
2 The Mobile Age ecosystem: Stakeholders and Users
Here we provide an overview of the roles of Mobile Age users. Note that these roles may overlap.
Local/regional governments: These may manage the co-creation activities, define features of the applications, and serve as experts for a specific service domain. In many cases, local governments are also the data owners.
Software developers: These can be independent developers or companies, or developers working for the IT departments of public authorities or civil society organisations such as the Open Knowledge Foundation. They develop the applications using the platform and participate in the co-creation activities, adjusting the applications to accommodate the participants’ requests and demonstrating the results in an iterative process.
Older adults: They are a key stakeholder of Mobile Age and the main users of the mobile applications being developed. They may participate in the core project group or engage in the broader co-creation activities.
Service providers such as government, social welfare organisations, religious congregations or NGOs may be part of the core project group or engaged for specific input. Some of the service providers may also provide (open) data.
Intermediaries include professionals and non-professionals that may support the co-creation activities by providing input for specific tasks in the co-creation process. They may become users of the applications developed.
Facilitators are individuals experienced in working with older adults and/or groups. They support the co-creation activities by, for example, running workshops, focus groups and interviews.
Other organisations & individuals comprise for example senior citizen organisations, senior citizens’ clubs (e.g. computer clubs) but also media and journalists that may report about the co-creation activities, and thereby support engagement as well as dissemination.
3 Methodology
This chapter presents the testing methodology followed to test the platform components. It contains a brief overview of both MADE and CIDER and their architectures, listing the modules under test, and provides the test reporting template used to document the results in Chapter 4.
3.1 Mobile Age Deployment platform (MADE)
This section refers to the Mobile Age Deployment Platform (MADE), which primarily aims to offer developers an environment, infrastructure and services that facilitate the development, deployment and running of elderly-friendly applications. To this end, the key design targets of MADE can be summarized in the following points:
• Provide a way to easily develop and deploy web and mobile applications covering state-of-the-art technology stacks.
• Provide access to a variety of open datasets and services upon which the applications will be built.
• Provide access to a set of specifically created elderly-friendly front-end components.
• Enable usage analytics for their applications.
3.1.1 Architecture
This section specifies the MADE architecture which is based upon the aforementioned key design points.
In an effort to meet the primary design targets of Mobile Age, the technical architecture of the MADE is driven by the following key characteristics:
• Provision for multiple isolated environments and services interacting with each other through orchestration mechanisms provided by the system. This is enabled by the use of a Docker [1] infrastructure, providing a set of containers with different specifications that are created and started on-demand by the developers according to their needs.
• Incorporation of Software as a Service components that provide RESTful APIs [2], [3]. This is a fundamental decision that ensures extensibility and enables modular design, decoupling the services from the platform and making them accessible independently.
• Offering its full functionality via a web interface, essentially making the MADE accessible through any device that can run a web browser.
The MADE technical architecture is driven by the concept of multiple isolated environments and services communicating with each other using the orchestration mechanisms provided by the system itself. On top of that, MADE uses state-of-the-art approaches to provide a user-friendly interface that combines flexibility and efficiency of use with robust support for the advanced functional requirements that originate from developers.
The key characteristic of MADE is the use of multiple containers created on demand by MADE users, each composed of different pieces according to the developers’ specifications. This decision enables all platform users to act independently without affecting the others, while still giving them access to all the necessary resources and services provided by the system. As for the UI layer, MADE exposes its full functionality through the browser, making it accessible from almost every device with an internet connection. Appropriate APIs are provided to facilitate the development of native applications.
Another MADE characteristic is that it incorporates Software-as-a-Service components accessible to all application containers through RESTful APIs. Providing services through independent modules that comply with REST design principles reduces coupling between the platform components while ensuring extensibility and fault tolerance.
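To illustrate this decoupling, the sketch below expresses container lifecycle operations (whose test results are reported in Figures 26 to 31) as plain REST request descriptors towards a containers manager. The route and payload shapes are illustrative assumptions, not the documented MADE API:

```javascript
// Hypothetical sketch: container lifecycle operations as REST requests.
// The caller stays decoupled from how the platform drives Docker underneath.
function containerAction(appId, action) {
  const allowed = ["create", "start", "stop", "terminate"];
  if (!allowed.includes(action)) {
    throw new Error(`Unknown container action: ${action}`);
  }
  return {
    method: "POST",
    path: `/api/apps/${appId}/container/${action}`, // illustrative route
    headers: { "Content-Type": "application/json" },
  };
}

const req = containerAction("my-demo-app", "start");
console.log(`${req.method} ${req.path}`);
```

A web UI, a CLI or a native application can all issue the same request, which is exactly the property the REST-based design is meant to provide.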
Figure 6 illustrates the final MADE architecture which consists of modules belonging to three different conceptual layers:
• Users Layer
It refers to the MADE part directly exposed to developers and open data providers.
• PaaS Layer
It refers to all low-level modules that constitute the Platform-as-a-Service part.
• SaaS Layer
It refers to all modules integrated in the PaaS each of which acts as a different service. All modules that belong to the SaaS layer can be considered as independent entities each providing a different set of functionalities.
The modules belonging to each of the aforementioned layers work together to ensure that the key design targets of MADE are met.
3.1.2 List of PaaS Modules
• Web server
• Applications Module
• Local Git Repository
• OGD Proxy
• User Manager
• Applications Manager
• Containers Manager
• OSCPSEP Functional database
• DB Handler
• OGD Database
• OGD Search and Annotation Module
3.1.3 List of SaaS Modules
• Search Component
• Annotation Component
• Reusable Front-Ends Module
• Behavioural Analytics Module
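As an indication of how the Behavioural Analytics Module is consumed, the sketch below mirrors the behaviour reported in Figures 41 and 42: an event is checked for required fields before being POSTed, and any invalid fields are reported back. The field names used here are assumptions for illustration only:

```javascript
// Sketch of validating an analytics event before POSTing it to the
// Behavioural Analytics Module. Required field names are hypothetical.
function validateEvent(event) {
  const required = ["tid", "type", "timestamp"];
  const invalidFields = required.filter(
    (field) => event[field] === undefined || event[field] === ""
  );
  return { valid: invalidFields.length === 0, invalidFields };
}

const ok = validateEvent({ tid: "T-001", type: "page-view", timestamp: Date.now() });
const bad = validateEvent({ type: "page-view" }); // missing tid and timestamp
console.log(ok.valid, bad.invalidFields);
```

Returning the list of invalid fields, rather than a bare failure, matches the behaviour shown in Figure 42 and makes client-side debugging considerably easier.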
3.2 Co-creation Information Documentation EnviRonment (CIDER)
3.2.1 Architecture
Based on the primary objective of CIDER, the key architectural design points can be summarized as follows:
• Provide a systematic way of describing co-creation data.
This point refers to the creation of a formal data model that serves two goals. The first is to provide a sufficient way to describe the data that originate from co-creation activities; the second is to follow database design principles. Following those principles makes it possible to create the necessary database infrastructure to store and handle the produced information.
• Provide a systematic way of providing co-creation data.
The data provision procedure involves defining a common way for all field sites to log their co-creation data so that it can be stored in CIDER.
• Provide a way for analysing co-creation data.
The data analysis part is very important in the co-creation process, as it enables the extraction of valuable information and facilitates knowledge sharing among different co-creators, while also enabling the formulation of both general and application-specific reusable knowledge bases.
• Provide a way for reviewing co-creation information.
The term co-creation information refers to the original co-creation data along with their analysis results. The information review part involves several challenges. These challenges originate from the fact that information retrieval has to follow the semantics and the information flow of the co-creators, while ensuring usability for the end users.
Based on these key design points, Figure 7 depicts the CIDER architecture, which accommodates each of the primary targets above. The architecture consists of the following key parts:
✓ Data Model
The data model is a formal description of the co-creation data as a relational database. All co-creation activities in all four field sites are now logged in a common format, which complies with a data model developed together with the WP3 partners and caters for the logging of co-creation activities. The data model, illustrated in Figure 8, is of utmost importance as it is the information basis upon which all the other modules are built; it has gone through a number of revisions aiming to achieve a balance between genericness and specificity.
✓ Data Handler
This module consists of four parts: the data reader, the data validator, the data importer and the queries formulator.
▪ The data reader is responsible for reading Excel templates based on the data model. These templates are common to all field sites and are used by the co-creators to log their co-creation data.
▪ The data validator is responsible for checking the co-creation data against a number of rules imposed by the data model.
▪ The data importer is responsible for importing the co-creation data into CIDER.
▪ The queries formulator is responsible for creating the queries needed to extract the information required for the various analysis concepts.
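As an indication of the kind of checks the data validator performs, the following sketch validates one logged row against a few rules. The column names and rules here are illustrative assumptions; the actual rules are those imposed by the data model of Figure 8:

```javascript
// Sketch of rule checking over rows read from the Excel templates.
// Column names and rules are hypothetical, not the actual data model.
const rules = {
  fieldSite: (v) =>
    ["Bremen", "South Lakeland", "Zaragoza", "Central Macedonia"].includes(v),
  activityDate: (v) => !Number.isNaN(Date.parse(v)),
  participants: (v) => Number.isInteger(v) && v >= 0,
};

// Returns the list of columns that violate a rule (empty if the row is valid).
function validateRow(row) {
  return Object.entries(rules)
    .filter(([column, check]) => !check(row[column]))
    .map(([column]) => column);
}

const errors = validateRow({
  fieldSite: "Bremen",
  activityDate: "2018-05-14",
  participants: -3, // invalid: negative participant count
});
console.log(errors); // → [ 'participants' ]
```

Rejecting a row before import, with a list of the offending columns, keeps the CIDER database consistent and gives the co-creators actionable feedback on their templates.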
✓ Web Application
The Web Application is the module responsible for exposing all the co-creation information to the co-creators. It consists of two distinct parts, the front end and the backend. The backend performs the data handling and contains the database infrastructure where all the data are stored.
Figure 8 - Data model for co-creation activities
3.3 Demonstrator Applications
3.3.1 Bremen apps
Demonstrator Application at Bremen-Osterholz (Phase 1)
The demonstrator application for the first phase is built as an HTML-based web application using HTML5, JavaScript and CSS. The application runs in any mobile or desktop browser environment and with all common display resolutions. Additionally, it enables the developer to provide hybrid applications for all main mobile operating systems, such as Android or iOS, with minimal effort. It is developed with accessibility in mind and has been designed to work with the built-in device accessibility options. In the Android operating system these are found under “Accessibility” within the device settings; for example, font and user interface elements can be increased or decreased in size system-wide. The demonstrator application follows two different concepts for providing access to the information. The first is text-based and provides search forms and result lists. The second is based on a representation of city maps with layers of symbols, icons and small preview windows. Both paths end on a page with detailed information.
In addition, the user interface provides two different navigation concepts. The first is menu-based and uses a navigation drawer on the right side of the application; the drawer provides direct access to the result lists and the map views at any time while using the app. The second concept is organized hierarchically and uses the buttons/links on the start page and the following pages to navigate down to the desired information. With the “back” and “start page” buttons the user can navigate upwards in the hierarchy.
Photo galleries give the user impressions of the locations. The photos were taken and uploaded by the older adults who participated in the co-creation workshops. To help maintain the collected data, all users have the option of sending updates via a comment field below every information page.
Demonstrator Application at Bremen-Hemelingen (Phase 2)
The phase-two app is a technical evolution of the phase-one demonstrator application. It is also built as an HTML-based web application using HTML5, JavaScript and CSS. The application runs in any mobile or desktop browser environment and with all common display resolutions; additionally, it enables the developer to provide hybrid applications for all main mobile operating systems, such as Android or iOS, with minimal effort.
Development continued with accessibility in mind, and this demonstrator also supports the built-in device accessibility options. In the Android operating system these are found under “Accessibility” within the device settings; for example, font and user interface elements can be increased or decreased in size system-wide.
The user interface provides a menu-based navigation concept using the navigation drawer and a hierarchically organized navigation concept using the links on the start page. The application features an adapted map specially designed to meet the needs of older adults, using higher contrast, reduced complexity and larger text.
Photo galleries give the user impressions of places and walks. The photos were taken and uploaded by the older adults who participated in the co-creation workshops.
For maintaining and collecting the data, a more user-friendly backend (than in phase 1) was used. All participants of the co-creation workshop were invited to use the app’s backend. The backend provides:
• Forms for the structured input of data associated with a walk.
• Forms for the structured input of data associated with locations of a walk.
• Upload of photos and images associated with walks or locations.
• Connecting uploaded video clips to a walk or a location.
3.3.2 South Lakeland app
The demonstrator application is a cross-device app running on Android and iOS. Key features are discussed in the following paragraphs. The application has been designed specifically to integrate with standard device accessibility options. In the Android operating system these are found under “Accessibility” within the “Settings” app.
The demonstrator application architecture was designed to accommodate the requirements from the co-creation process and supports the various types of scenarios.
Figure 9 - High-level architecture diagram
The architecture design for the demonstrator app is shown in Figure 9. This architecture consists of three main elements designed specifically to encourage use by older adults: an analytics framework that enables widespread data collection yet addresses privacy concerns raised by older adults, a data storage and exchange facility that aims to lower the barrier of use while providing a security model that reflects older adults’ ways of working, and a unified application framework that provides “a walled garden” that helps reassure older adults of the trustworthiness of applications. For example, applications within the demonstrator such as Events, Profile, Services and Volunteering, are isolated applications that live within the application framework and individually communicate with other components across the framework and external services.
3.3.3 Zaragoza apps
The demonstrator application is built as an HTML-based collaborative map application using HTML5, JavaScript and CSS. The application is responsive and can be displayed in any mobile or desktop browser environment at all common display resolutions. It also follows the Web Content Accessibility Guidelines (WCAG) version 2.0 (https://www.w3.org/TR/WCAG20/) at level AA. Following these guidelines ensures that this demonstrator application is also accessible to people with disabilities.
The collaborative maps system uses Leaflet (http://leafletjs.com/) to visualize interactive maps, IDEZAR’s dataset as the data source and Oracle 11g to store the contents of collaborative maps.
3.3.4 Thessaloniki app
The technologies used for the implementation of Thessaloniki application are summarised below:
• MERN Stack Programming [4]
1. MongoDB [5]: MongoDB is used to implement the database needed by the application.
2. Express.JS [6]: Express.JS is used as the web application framework for Node.JS.
3. React.JS [7]: React.JS is used to implement the front-end (user interface) of the application.
4. Node.JS [8]: Node.JS is used to implement the backend.
• HTML5 [9]: The HTML5 geolocation and camera APIs are used to access the user’s current location and camera.
• Google OAuth [10]: A Gmail account is required for the initial setup of a user’s application profile, so the Google Client Authentication module is used.
• Android Studio [11]: In order to “package” the web application into an Android WebView component, Android Studio is used to compile an Android application to an .apk file.
Figure 10 - Architecture Overview Diagram
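As a rough illustration of the MongoDB side of the stack above, the following sketch shows a hypothetical shape for a point-of-interest document (hospital/pharmacy, matching the Thessaloniki use cases) together with a minimal validation helper. The field names and rules are assumptions for illustration, not taken from the actual implementation.

```javascript
// Hypothetical shape of a point-of-interest document as it might be stored
// in MongoDB for the Thessaloniki app. Field names are assumptions.
const examplePoi = {
  name: 'Example hospital',
  category: 'hospital', // e.g. 'hospital' or 'pharmacy'
  location: { type: 'Point', coordinates: [22.9444, 40.6401] }, // [lng, lat]
  phone: '+30 2310 000000',
};

// Minimal validation helper: returns a list of error strings (empty when valid).
function validatePoi(poi) {
  const errors = [];
  if (!poi.name) errors.push('name is required');
  if (!['hospital', 'pharmacy'].includes(poi.category)) {
    errors.push('category must be hospital or pharmacy');
  }
  const coords = poi.location && poi.location.coordinates;
  if (!Array.isArray(coords) || coords.length !== 2) {
    errors.push('location.coordinates must be [lng, lat]');
  }
  return errors;
}

console.log(validatePoi(examplePoi)); // → []
```

A GeoJSON-style `location` field is a common choice because MongoDB can index it for "nearest" queries of the kind the user scenarios require.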
3.4 Results reporting template
In this section we provide the results reporting template that will be used for the module testing procedure.
Test title
Test ID: The test id
Steps: The steps taken to perform this particular test
Expected Results: The results that are expected from executing the test.
4 Results
4.1 MADE
4.1.1 PaaS Modules
Sign Up as Developer
Test ID made-1
Steps
1- Open Mobile Age Platform
2- Choose “Register Now” option
3- Fill the required fields
4- Click “Sign Up” button
Expected Results
The system output after signing up should be either the Sign In page of the Mobile Age Platform, if all the information in the Sign Up form was valid, or a list of errors describing the faults in the form completion.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 11: Successful Sign Up
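The sign-up behaviour tested above (either proceed to Sign In, or return a list of faults) can be sketched as a small validation routine. The field names and rules here are assumptions for illustration; the actual MADE form may differ.

```javascript
// Hypothetical sign-up validation mirroring the expected results above.
function validateSignUp(form) {
  const errors = [];
  if (!form.username) errors.push('Username is required');
  if (!form.email || !form.email.includes('@')) errors.push('A valid email is required');
  if (!form.password || form.password.length < 8) {
    errors.push('Password must be at least 8 characters');
  }
  return errors;
}

// Either proceed to the Sign In page, or return the list of faults.
function signUpOutcome(form) {
  const errors = validateSignUp(form);
  return errors.length === 0 ? { next: 'sign-in' } : { errors };
}

console.log(signUpOutcome({ username: 'dev', email: 'dev@example.org', password: 'longenough1' }));
// → { next: 'sign-in' }
```

The test case made-1 exercises exactly these two branches: a fully valid form and a form that produces a non-empty error list.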
Sign In as Developer
Test ID made-2
Steps
1- Open Mobile Age Platform
2- Choose “Sign in as a developer” button
3- Fill the required fields
4- Click “Sign In” button
Expected Results
The system output after signing in should be either the personalized Main Dashboard page of the Mobile Age Platform, if the credentials in the Sign In form were valid, or a list of errors describing the faults in the form completion.
Actual Results ✔ The test passes as the system output is as expected.
Create new Application
Test ID made-3
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Click “New Application”
3- Fill the required fields
4- Click “Add Application” button
Expected Results
The system output after the application creation should be either a notification of success and a new application with the given name in the ‘My Applications’ menu, or an error notification at the bottom of the page.
Actual Results ✔ The test passes as the system output is as expected.
Create Development Environment for an Application
Test ID made-4
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from “My Applications” menu
3- Choose one of the preconfigured containers
4- Click “Deploy Container” button
Expected Results
The system output after the creation of the development environment should be either the Dashboard of the application, or an error notification at the bottom of the page.
Actual Results ✔ The test passes as the system output is as expected.
Expected Results
The system output after the upload of code should inform the user about the outcome of the procedure. If it was successful, a message pops up; if an error occurred, a notification with a list of the errors appears.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Synchronize code to application’s container
Test ID made-6
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Synchronize Code’ button in ‘Source Code’ tab
Expected Results
The system output after the synchronization of the code should be the tree of the Git repository content, if the given password was correct, or an error message at the bottom of the screen, if something in the procedure failed.
Actual Results ✔ The test passes as the system output is as expected.
Install application dependencies inside container
Test ID made-7
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Install Dependencies’ button in ‘Source Code’ tab
Expected Results
The system output after the installation of dependencies should be a success message at the bottom of the screen or an error message with a list of errors, if something went wrong.
Actual Results ✔ The test passes as the system output is as expected.
Deploy application
Test ID made-8
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Deploy Application’ button in ‘Source Code’ tab
Expected Results
The system output after the deployment of the application should be a success message at the bottom of the screen and the activation of the ‘Preview Application’ button, which redirects the user to the application URL, or an error message with a list of errors, if something went wrong.
Actual Results ✔ The test passes as the system output is as expected.
Stop application
Test ID made-9
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Stop Application’ button in ‘Source Code’ tab
Expected Results
The system output after the termination of the application should be a success message at the bottom of the screen and the termination of the application at its specific URL, or an error message with a list of errors, if something went wrong.
Actual Results ✔ The test passes as the system output is as expected.
Start container
Test ID made-10
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Start Container’ button in ‘Information’ tab
Expected Results
The system output after the activation of the application’s container should be a success message at the bottom of the screen and the start of the application’s container, or an error message with a list of errors, if something went wrong.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 28: Successful start of application's container
Stop container
Test ID made-11
Steps
1- Open Mobile Age Platform as Signed In Developer
2- Choose an application from ‘My Applications’ menu
3- Click ‘Stop Container’ button in ‘Information’ tab
Expected Results
The system output after the termination of the application’s container should be a success message at the bottom of the screen and the stop of the application’s container, or an error message with a list of errors, if something went wrong.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 29: Unsuccessful start of application's container
Steps
1- Send a request to the endpoint
2- Check that the status code is OK and a dataset id is returned
3- Check that the dataset is registered in Virtuoso and CKAN
Expected Results
The endpoint should return an OK status code and the id of the dataset. Also the metadata of the registered dataset should be stored in Virtuoso and CKAN.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
- The underlying technologies used by this endpoint are Mappingpedia, Virtuoso and CKAN.
- The test can be performed by sending a request to the endpoint http://83.212.100.226:8092/datasets{organizationID} with body parameters distribution_download_url, dataset_title and dataset_description.
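For reference, the request described above can be assembled as follows. This helper only builds a description of the request (no network call is made); the POST method and the `/datasets/{organizationID}` path separator are assumptions based on the text, as are the camel-case parameter names on the input side.

```javascript
// Assemble the dataset-registration request described above (sketch only;
// method and path shape are assumptions, no network call is performed).
function buildDatasetRequest(organizationId, params) {
  return {
    method: 'POST',
    url: `http://83.212.100.226:8092/datasets/${organizationId}`,
    body: {
      distribution_download_url: params.distributionDownloadUrl,
      dataset_title: params.datasetTitle,
      dataset_description: params.datasetDescription,
    },
  };
}

const req = buildDatasetRequest('test-mobileage-upm', {
  distributionDownloadUrl: 'http://example.org/data.csv',
  datasetTitle: 'Example dataset',
  datasetDescription: 'A dataset registered for testing',
});
console.log(req.url); // → http://83.212.100.226:8092/datasets/test-mobileage-upm
```

Separating request construction from sending makes the body parameters easy to inspect before contacting the live endpoint.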
Expected Results
The endpoint should return an OK status code and the id of the dataset. Also the metadata of the registered dataset should be stored in Virtuoso and CKAN.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
- The underlying technologies used by this endpoint are Mappingpedia, Virtuoso and CKAN.
- The test can be performed by sending a request to the endpoint http://83.212.100.226:8094/mappings/test-mobileage-upm/{datasetid} with body parameters mapping_document_download_url.
Search module tests
Test ID made-16
Steps
1- Run the tests below from Django
2- Run the manage.py file with the parameter test
3- Each test will result in OK, Fail or Error
Expected Results
Testing the dataset search endpoints through Django should return valid responses, confirming they are working as expected.
Actual Results
✔ Tests if you can get a data set through the endpoint
✔ Tests with and without query parameter
✔ Test the usage of the q parameter
✔ Test the usage of the language parameter
✔ Test the usage of the category parameter
✔ Test the usage of the publisher parameter
✔ Test if endpoint is responding
✔ Test if endpoint provides data
✔ Check if service search can get only internal services
✔ Test if you can get the internal APIS through the internal API endpoint
✔ Test if you can get the full list of languages
✔ Test if you can get the list of languages in use
✔ Test if you can get the list of languages not in use
NOTES: A few tests fail or crash. However, they are not essential to our use of the module. Many of the failing tests are related to the native GUI (e.g. for user tracking). We only use the API, and therefore some of the test results are not relevant:
✖ Tests tracking page with many views
✖ Tests coding standards
✖ Test common requests on flask request
✖ Test legacy functional test tracking with count throttling
Log in to analytics dashboard
Test ID made-17
Steps
1- Navigate to the Analytics Dashboard
2- Select “Login”
3- Enter login details
4- Select “Log in”
Expected Results
The system should redirect to the dashboard if provided a valid username and password combination. If any other combination is provided, an error should be shown to the user to enter correct details.
Actual Results ✔ The test passes as the system output is as expected.
Figure 33: Dashboard after successful login with valid credentials
Figure 34: Result if invalid details are entered
Create new tracking id (logged in user)
Test ID made-18
Steps 1- A POST request is made by a logged-in user
Expected Results
When correct details are provided to the API, JSON data should be returned containing the following data: tid, name, description, owner, registrationSourceHost, created, disabled, pipeline.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 35: Request made using ‘Postman’ showing the form data, and JSON response from the server
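The shape of the JSON object returned for a new tracking id, as listed in the expected results above, can be sketched as follows. The field names come from the document; the value formats (placeholder id generator, defaults for `disabled` and `pipeline`) are assumptions for illustration.

```javascript
// Sketch of the tracking-id record described in the expected results.
// Field names are from the document; value formats are assumptions.
function createTrackingRecord(name, description, owner, sourceHost) {
  return {
    tid: Math.random().toString(36).slice(2, 10), // placeholder id generator
    name,
    description,
    owner, // e.g. null for an anonymous user
    registrationSourceHost: sourceHost,
    created: new Date().toISOString(),
    disabled: false, // assumed default
    pipeline: [], // assumed default
  };
}

const record = createTrackingRecord('demo-app', 'Demo tracking id', 'dev@example.org', 'localhost');
console.log(Object.keys(record));
```

The same record shape applies to both made-18 (logged-in user, `owner` set) and made-19 (anonymous user).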
Create new TID as anonymous user
Test ID made-19
Steps 1- An anonymous user posts a request for a new tracking id.
Expected Results
When correct details are provided to the API, valid JSON data should be returned containing the following data: tid, name, description, owner, registrationSourceHost, created, disabled, pipeline.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 36: Request made using ‘Postman’ showing the form data, and JSON response from the server but with no authentication
Retrieve reports after login
Test ID made-20
Steps 1- Visit the dashboard after login.
Expected Results
Navigating to the dashboard after login, the set of reports and graphs should be displayed (incl. Users, Activity, Exit points, Events, Event category break-down). If the user has no tracking ids, the dashboard should not show any data.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 37: Analytics Dashboard - Showing the reports for a particular TID of a registered/logged in user
Retrieve reports by TID
Test ID made-21
Steps
1- Visit the dashboard while not logged in.
2- Manually enter the TID into the text box
3- Select ‘Submit’
Expected Results
Navigating to the dashboard after entering a valid TID, the set of reports and graphs should be displayed (incl. Users, Activity, Exit points, Events, Event category break-down). If an invalid TID is entered, no data will show on the dashboard.
Actual Results ✔ The test passes as the system output is as expected.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 40: JSON result after an authenticated user requesting their TID
API endpoints 2 (Analytics event: POST)
Test ID made-23
Steps 1- POST request to push a new analytics event
Expected Results
When a set of valid JSON data is pushed to the API, a ‘201 created’ response should be returned. If any invalid sets of data are sent, error messages are returned with a ‘400 bad request’ response.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 41: Response to POST with a valid event data
Figure 42: Response to POST with invalid event data (includes which fields were invalid)
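The ‘201 created’ / ‘400 bad request’ behaviour tested above can be sketched as a small handler. The set of required fields used here is an assumption for illustration; the actual analytics event schema may differ.

```javascript
// Sketch of the analytics-event endpoint behaviour: valid payloads yield a
// 201, invalid ones a 400 listing the offending fields (fields are assumed).
function handleAnalyticsEvent(payload) {
  const required = ['tid', 'event', 'timestamp'];
  const missing = required.filter((field) => payload[field] === undefined);
  return missing.length === 0
    ? { status: 201 }
    : { status: 400, errors: missing.map((field) => `${field} is missing`) };
}

console.log(handleAnalyticsEvent({ tid: 'abc123', event: 'page-view', timestamp: 1554373020 }).status); // → 201
console.log(handleAnalyticsEvent({ tid: 'abc123' }).status); // → 400
```

Returning the list of invalid fields in the 400 response matches the behaviour shown in Figure 42.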
API endpoints 3 (Analytics event: GET)
Test ID made-24
Steps 1- GET request to fetch the most recent analytics events for a given tracking id.
Expected Results
Given a valid TID, the system should respond with an array of JSON objects representing analytics events. If an invalid TID is requested, the system should return an error message.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Figure 43: Response for valid GET request
4.1.4 Hackathon results
On 31 January 2019, we hosted a hackathon on the AUTH campus. Students from various semesters attended, testing and evaluating the usability and functionality of the Mobile Age Deployment & Engagement Platform. The students were asked the list of questions shown below; the results are presented in the “Results” section.
Questions

The questionnaire items were grouped under the categories Suitability for the task, Self-Descriptiveness, Controllability, Conformity, Suitability for Learning and Relevance:

1- The application facilitates the process of design and implementation of web apps.
2- The application is easy to operate.
3- The application offers all the functions to cope with the tasks that arise.
5- The application provides a good overview of its range of functions.
6- The application requires little time to learn.
8- The application does not force unnecessarily rigid adherence to processing steps.
9- The application allows easy switching between screens.
10- The application is designed in such a way that the user can influence how and which information is presented.
11- The application reacts quickly.
4.2 CIDER
Accessibility testing with tota11y
Test ID cider-1
Steps
1- Inject the script of the tota11y tool (http://khan.github.io/tota11y/) into the webpage
2- Open the webpage and access the tota11y visualization toolkit
Expected Results
The system should pass all accessibility tests that the tota11y script executes.
Actual Results ✔ The test passes as the system output is as expected.
Additional Information:
Some screenshots from the visualization of the procedure are presented below. Specifically, the screenshots are from the main page of the website, where every annotation (Contrast, Headings, etc.) of tota11y was checked to confirm that it did not violate any rules. The same procedure was repeated for all the pages of the website.
Quality testing with Lighthouse tool
Test ID cider-2
Steps
1. Go to https://developers.google.com/web/tools/lighthouse/run
2. Enter https://co-creation.mobile-age.eu
3. View report
Expected Results
The system should pass all tests of Progressive Web App, Accessibility, SEO and Best practices.
Actual Results ✔ The test passes as the system output is as expected.
Additional information:
The summary report from the Lighthouse tool is shown in the images below. As can be seen, the site scored perfectly on Progressive Web App, Accessibility, SEO and Best Practices. Finally, the Performance score is 42, which represents the 70-75th percentile.
4.3 Demonstrator Applications
In this section we present the functionality tests carried out for the demonstrator applications. This content has been reported in deliverables D4.2 – D4.5 (Prototype Demonstrator Applications at Bremen, South Lakeland, Zaragoza and Central Macedonia, respectively) and is included here for completeness, following the specified reporting format. The reader is referred to those deliverables for more details.
4.3.1 Bremen apps
Functionality testing – Bremen Hemelingen
Test ID bremen-hemelingen
Steps
1. The navigation overview diagram is taken for reference
2. Each scene, view and transition of the app is tested to check if it adheres to the navigation flow
Expected Results
See Table 4
Actual Results ✔ The test passes as the system output is as expected.
Description:
For the evaluation activities in Bremen Hemelingen, the walk “Rund ums Bürgerhaus” was chosen and tested and evaluated in detail. It was the preferred choice, as the walk is just 2 kilometres long and provides locations where the administrative parts and questionnaires can be filled in.
Pre-conditions:
In order to check the functionality of the application, the navigation overview diagram is taken for reference. As the application evolved slightly since submitting D4.2, the transitions between the start page, the map page and the walk details pages have changed. All transitions between pages as well as functionalities within the pages are tested on a sample page. As all pages are based on the same templates, it can be assumed that all other pages of the same type also work properly.
Item Functionality Result Remark
Button Benches works true
Button Lighting works true
Button Position works true
Button Close works true
4.3.2 South Lakeland app
Functionality testing – South Lakeland
Test ID south-lakeland
Steps
1. The navigation overview diagram is taken for reference
2. Each scene, view and transition of the app is tested to check if it adheres to the navigation flow
Expected Results
See Table 5
Actual Results ✔ The test passes as the system output is as expected.
Description:
The testing methodology consists of the developer doing a walkthrough of the entire application, pressing all of the buttons and interacting with all of the features, testing that the full range of available functionality works as expected. This is supported by unit tests, which allow automated testing of functionality (e.g. input validation and boundary testing). The unit tests are created using Jasmine, a development framework for testing JavaScript. To support testing of the client-side mobile application code (within Cordova), we also utilise jsdom to simulate a browser’s DOM, providing the functionality necessary for our code to run outside of a browser.
Pre-conditions:
The updated navigation diagram was first shown in deliverable D4.3 (Prototype demonstrator applications at South Lakeland); it has been updated to include “Transport options” within the “Events” section. In addition to the testing reported in this document, functional testing was performed throughout the co-creation process.
Figure 53 - Navigation Diagram - South Lakeland Social Connectedness Demonstrator
Results:
Table 5: South Lakeland - Results of Functional Tests
Item Functionality Result Remark
Launcher Is available, content and layout as expected true
Launcher All navigational links/buttons work true
Launcher Populate the launcher page with all available sub-sections/pages true
Events Is available, content and layout as expected
4.3.3 Zaragoza apps
Functionality testing – Zaragoza
Test ID zaragoza
Steps
1. The navigation overview diagram is taken for reference
2. Each scene, view and transition of the app is tested to check if it adheres to the navigation flow
Expected Results
See Table 6
Actual Results ✔ The test passes as the system output is as expected.
Description:
For the evaluation activities in Zaragoza, the short walk in the Almozara district was chosen, tested and evaluated in detail, together with the municipal website for older people developed and evaluated within this project.
Pre-conditions:
In order to check the functionality of the application, the navigation overview diagram is taken for reference. All transitions between pages as well as functionalities within the pages are tested on a sample page. As all pages are based on the same templates, it can be assumed that all other pages of the same type also work properly.
4.3.4 Central Macedonia app
For the evaluation activities in the Central Macedonia demonstrator, the whole application was tested for functionality and accessibility issues. In addition, to test the application’s usability, the older adults had to perform two user scenarios:
1. Finding and contacting (via phone) the nearest on-duty hospital of that particular day.
2. Finding and navigating to the nearest open pharmacy on that particular day.
Pre-conditions:
In order to check the functionality of the application, the navigation overview diagram is taken for reference. In this diagram, all the transitions between the application’s pages, as well as the functionalities within the pages, are depicted. Table 7 presents the results of the functional tests for each page of the application.
Figure 55 - Navigation Diagram – Central Macedonia Demonstrator
5 Conclusions
This document reports on the testing of the Mobile Age technical components, comprising MADE, CIDER and the field-site demonstrator applications. Given the testing strategy, which involves a series of tests evaluating diverse functionalities and usage scenarios, the test results confirm the proper behaviour of all components. As a result, and because the platform architecture adopts component-based development principles, we can safely conclude that the platform as a whole behaves properly. The same conclusion applies to the demonstrator applications.
References
[1] “Docker”. [Retrieved December 2016]. [Online]. Available: https://www.docker.com/
[2] Richardson, L., & Ruby, S. (2008). RESTful web services. “O'Reilly Media, Inc.”.
[3] Christensen, J. H. (2009, October). Using RESTful web-services and cloud computing to create next generation mobile applications. In Proceedings of the 24th ACM SIGPLAN conference companion on Object oriented programming systems languages and applications (pp. 627-634). ACM.
[4] “MERN Stack Programming”. [Retrieved January 2019]. [Online]. Available: http://mern.io.
[5] “MongoDB”. [Retrieved January 2019]. [Online]. Available: https://www.mongodb.com
[6] “Express.JS Framework”. [Retrieved January 2019]. [Online]. Available: https://expressjs.com.
[7] “React JavaScript Library”. [Retrieved January 2019]. [Online]. Available: https://reactjs.org
[8] “Node.JS JavaScript Runtime”. [Retrieved January 2019]. [Online]. Available: https://nodejs.org/en/.
[9] “HTML5”. [Retrieved January 2019]. [Online]. Available: https://developer.mozilla.org/en-US/docs/Web/Guide/HTML/HTML5.
[10] “Google OAuth”. [Retrieved January 2019]. [Online]. Available: https://developers.google.com/identity/protocols/OAuth2.
[11] “Android Studio IDE”. [Retrieved January 2019]. [Online]. Available: https://developer.android.com/studio/index.html
In order to evaluate the Mobile Age Deployment Platform, apart from the four pilot prototypes that were developed and deployed, one hackathon session was organized. It took place at the Department of Electrical and Computer Engineering at the Aristotle University of Thessaloniki on 31/01/2019.
The purpose of the hackathon was to let potential MADE users/customers become familiar with the platform and its functionality, and to get valuable feedback from them in terms of both functionality and usability.
Towards this direction, the procedure involved conducting two surveys, one prior to the hackathon and one after it. The pre-hackathon survey was conducted to gain a better understanding of the audience in terms of familiarity with the involved technologies, needs that arise during the software development process, and development tools already in use. The post-hackathon survey was conducted to evaluate the platform along a series of different axes. First of all, it evaluated the platform’s functionalities in terms of both robustness and completeness, as the participants were able to complete end-to-end development scenarios and test whether the provided functionalities fit their needs. In addition, given that the software development process differs a lot between individuals, the participants were able to test the platform in terms of usability. During this process, we focused on quantifying the following usability-related characteristics (see ISO/IEC 25010, https://iso25000.com/index.php/en/iso-25000-standards/iso-25010/61-usability):
- Appropriateness recognizability
- Learnability
- Operability
- User error protection
- User interface aesthetics
- Accessibility
Pre-Hackathon Survey Questions
- Have you ever been involved in back-end development for a web application?
- Have you ever been involved in front-end development for a web application?
- Which tools do you use for designing your application?
- Which of the following do you consider as the most difficult step while designing an application?
- Have you ever dropped the idea of creating a certain application? If yes, what