Edinburgh’s Meadows to be Wired For Sound

The Internet of Things will help Edinburgh University understand the city’s public park in a whole new way.


A new project from the University of Edinburgh will place sensors around the Meadows public park in Edinburgh’s city centre, to help researchers understand the ways in which people and nature coexist.

Sound sensors will capture ultrasonic and audible noises of bats, birds and other wildlife, as well as traffic and human activity, while others will record light levels, humidity and temperature.

The goal is to answer a range of questions about the city’s interactions with nature. Findings may reveal the activity of the park’s bat population, show how traffic noise influences animal behaviour, or highlight seasonal variations in how people visit the park.

The outcomes could inform how land is used for the benefit of people, wildlife and the economy, and improve the quality of urban green space. Data from the project will also be used to inspire an interactive sound artwork.

The experiment forms part of the University of Edinburgh’s Internet of Things initiative, which is exploring ways in which internet-connected devices can enrich everyday life.

It is led by Edinburgh Living Lab, a city-wide collaboration founded by the City of Edinburgh Council and the University of Edinburgh, and is working with partner organisations such as the Scottish Wildlife Trust and community groups such as Friends of the Meadows and Bruntsfield Links.

Professor Ewan Klein of the University’s School of Informatics told DIGIT:

“Sounds tell us so much about what’s happening around us – birds in a garden, a night on the town, or fire engines rushing to an emergency – but this can be pushed to the back of our awareness. The CitySounds project will be valuable in exploring and celebrating the richness of urban sounds, benefiting from developments in digital technology and network infrastructure.”

Yusef Samari of Friends of the Meadows and Bruntsfield Links said:

“We are delighted to be involved in this project, which is giving us an entirely new perspective on our much-loved park. We look forward to finding out what the sounds of the Meadows can tell us, particularly regarding its wildlife.”

Source

New Tech to Help Offshore Oil & Gas Workers

New wearable and app technology will provide real-time information access for offshore oil and gas platform managers and operatives.


Eigen, an offshore oil and gas digitalisation software provider and systems integrator, has worked in partnership with Lundin Norway to create a new app that enables real-time access to critical data via wearable devices. This instant, easy access to operational information can be granted to digital workers through either an app or a headset with a visor interface.

Using sophisticated image-processing technology, or by entering search terms, users can have the system automatically identify equipment and display related information, including:
• Documentation
• Live performance data
• Equipment parameters such as manufacturer or current spare part inventory
• Maintenance history and future plans
• Any alerts or alarms that have been raised recently

Instant Direct Feedback Will Improve Efficiency

This new technology has been built on Eigen’s Ingenuity platform, a smart layer that provides contextual linking on top of existing systems. This extra digital capability will let engineers make better use of their time by cutting the time spent interpreting statuses and readings. Such direct feedback will help drive more valuable refinements and increase overall operational efficiency.

Ingenuity has been developed and configured to run on Android-based wearables and will shortly be available on the App Store. The aim is for the technology to go into full production in June this year. Eigen already plans to trial its implementation on the Norwegian Continental Shelf in the near future.

Murray Callander, CEO of Eigen, said:

“This new technology solution enables digital working offshore, making critical information available in real-time to those who need to use it. The technology is a prime example of our approach of rapid evolution of operational capability enabled with Industry 4.0 technologies.”

“The Ingenuity platform enables the speed of innovation and solutions required to rapidly evolve and deliver new capabilities.”

“This demonstrates the value of our platform technology, combined with an approach to making incremental investment based on business cases that deliver value at each step.”

Source

World’s First Blockchain Identity Lab to be Built in Edinburgh

Edinburgh Napier University is building a pioneering new blockchain research laboratory as part of a £600,000 collaboration.


The new Blockpass Identity Lab (BIL) is to be built at Edinburgh Napier University’s Merchiston campus as part of a collaboration between the university and Hong Kong-based Blockpass. The purpose of the BIL is to explore ways in which blockchain technology can protect personal data from online scammers and hackers.

The creation of the lab will boost Napier’s already outstanding global reputation as a pioneering and leading centre for cybersecurity training and excellence. The timing is apt, with blockchain coming to the forefront of the data privacy debate.

After a series of huge data breaches at companies such as Uber and Equifax, blockchain offers an attractive alternative to the centralised storage of personal user data. This initial three-year partnership also includes funding for research staff, PhD studentships and a virtualised blockchain environment.

The announcement follows swiftly on the heels of Napier’s successful launch of its new cyber academy, SOCLAB. This new facility encourages collaboration between industry and academia to dramatically improve cyber security.

Dr Hans Lombardo, Blockpass Chief Marketing Officer said:

“We continue to see identity management at the forefront of blockchain and cryptography discussions as the price of consumer data abuses becomes clearer and more pertinent.”

“The creation of this lab in conjunction with Edinburgh Napier University will provide a space where further research and innovation can lead that discussion to newer and more advanced grounds.”

Professor Bill Buchanan of Edinburgh Napier’s School of Computing, the Director of the Lab, said:

“The world is changing and cryptography is now being used to fix many of the problems we have created on the internet. It can now help create a better society, with the citizen at its core.”

“We aim to contribute to the building of a new world, based on blockchain. Whether it is health and well-being, or the changing of our public services, it is likely to be blockchain methods that will provide the foundation for the future.”

Dr Sally Smith, Dean of the university’s School of Computing, said:

“This is another step forward in the advancement of our research and innovation, and builds on a strong track record of success. This collaboration builds a foundation for the future, and supports the development of advanced skills in blockchain research.”

Source

Scottish Tech Company Recreates Lost JFK Speech

Using 116,777 sound units from 831 of JFK’s speeches and radio addresses, sound engineers in Edinburgh have been able to recreate the late president’s final speech in his own words.


Fifty-five years after the assassination of John F Kennedy, engineers at Edinburgh-based company CereProc have used new technology to recreate his voice, allowing us to hear the speech the president was due to give on that fatal day in Dallas.

After Kennedy’s unexpected and tragic death at the age of 46 on November 22nd 1963, the text of his intended speech was preserved for posterity and was used by researchers to create this 22-minute audio clip.

Alan Kelly, Executive Creative Director at ROTHCO, who came up with the concept of JFK Unsilenced, said: “I was watching a documentary about the president’s Dallas trip and I had never really thought about where he was on his way to when he was shot. I hadn’t heard about the speech and I didn’t know it existed.”

“It had obviously been written in advance, but I don’t think it had registered with many people. I looked it up online and was blown away by how prescient it remains today. By bringing his voice back to life to deliver this speech, the message is even more powerful.”

Recreating Kennedy’s Lost Speech

Commissioned by ROTHCO, the project was completed in partnership with CereProc and The Times newspaper. CereProc spent eight weeks painstakingly recreating the 2,590 words of Kennedy’s undelivered speech, which he had planned to give at the Dallas Trade Mart that day. Using machine learning and AI techniques, the team were able to meticulously map Kennedy’s unique cadence and voice. By breaking down his existing speeches and material, the team could then stitch the content back together to create an entirely new speech.

Chris Pidcock, Co-founder and Chief Voice Engineer at CereProc explained: “There are only 40/45 phonemes in English so once you’ve got that set you can generate any word in the English language. The problem is that it would not sound natural because one sound merges into the sound next to it so they’re not really independent. You really need the sounds in the context of every other sound and that makes the database big.”

“Trying to harmonise the environment and manipulate the audio so that it ran together was quite difficult. Getting to that point is pretty challenging given the variable audio quality, as well as the speech itself having different qualities and noise levels. One of the things we needed to do was get a very accurate transcription of the audio so that things like ‘umms’ and ‘ehs’ could be labelled and we could make sure the phonetic pieces we got were correct.”

“If you label them incorrectly you might pick the wrong piece and the whole sentence will sound wrong. We’ve been working on machine-learning models with deep neural networks to help predict the way that his intonation works so that we could build a more accurate predictor of where his speech would go and where he’d put his emphasis. We used that to improve the way that we could generate the speech output for Kennedy. That was a new thing that we had been testing but we hadn’t used in a real project before.”
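What Pidcock describes is, in essence, concatenative unit-selection synthesis: recorded sounds are stored together with their neighbouring context and stitched back together while keeping the joins smooth. Purely as a hedged illustration of that idea (this is not CereProc’s code; the class, the unit fields and the pitch-based join cost are invented for the example), a diphone lookup could be sketched like this:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of diphone-based unit selection (hypothetical, not CereProc's system).
public class DiphoneSelector {

    // A recorded fragment covering the transition between two neighbouring phonemes.
    record Unit(String diphone, double pitch, String sourceClip) {}

    private final Map<String, List<Unit>> database = new HashMap<>();

    void addUnit(Unit u) {
        database.computeIfAbsent(u.diphone(), k -> new ArrayList<>()).add(u);
    }

    // Pick one unit per diphone, preferring candidates whose pitch is close to the
    // previously chosen unit so the concatenation joins sound less abrupt.
    List<Unit> select(List<String> phonemes) {
        List<Unit> chosen = new ArrayList<>();
        for (int i = 0; i + 1 < phonemes.size(); i++) {
            String key = phonemes.get(i) + "-" + phonemes.get(i + 1);
            List<Unit> candidates = database.getOrDefault(key, List.of());
            if (candidates.isEmpty()) continue;   // a real system would back off to single phones
            Unit previous = chosen.isEmpty() ? null : chosen.get(chosen.size() - 1);
            Unit best = candidates.get(0);
            if (previous != null) {
                for (Unit c : candidates) {
                    if (Math.abs(c.pitch() - previous.pitch())
                            < Math.abs(best.pitch() - previous.pitch())) {
                        best = c;
                    }
                }
            }
            chosen.add(best);
        }
        return chosen;
    }
}
```

With roughly 40-45 phonemes, the number of phoneme pairs alone runs into the thousands, and each pair needs many recorded examples in different contexts, which is why the database Pidcock mentions becomes so large.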

Practical Applications for the Future

This model could soon be used in CereProc’s voice-cloning tool to help people who are losing their voice due to illnesses such as motor neurone disease. The tool allows individuals at risk of losing their ability to speak to clone their voice, requiring only three to four hours of recorded data. Rather than having to rely on synthetic, computer-generated voices, they can maintain more of their own unique personality.

Source

Fantastic Final Review for DICE


We are very proud to announce that, after four intense years of careful hard work, the DICE consortium received a very warm and encouraging reception at its final review at the European Commission in Brussels. The reviewers were particularly impressed by the quality of the work done in the project and complimented the standard of the written deliverables submitted. Once again, DICE sets a standard for other projects to follow. The work on dissemination was judged effective and creative, with great use of animated video produced by flexiOPS. Overall, the project exceeded the effort promised in the Description of Work, and we can safely describe it as an excellent body of research.

DICE is an open source DevOps solution for Big Data applications.
www.dice-h2020.eu

Release of the Final DICE Framework


After a 36-month R&D collaboration, the DICE consortium is pleased to announce the final release of the open source DICE framework and its two commercial versions, DICE Velocity and DICE BatchPro.

DICE delivers innovative development methods and tools to strengthen the competitiveness of small and medium European ISVs in the market for business-critical, data-intensive applications. The barriers that DICE breaks down are the shortage of methods for expressing data-aware quality requirements in model-driven development, and the difficulty of consistently considering these requirements throughout DevOps tool-chains during quality analysis, testing, and deployment of the application. Existing methodologies and tools provide these capabilities for traditional enterprise software systems and cloud-based applications, but for increasingly popular technologies such as Hadoop/MapReduce, Spark, Storm, or Cassandra it was difficult before DICE to adopt a holistic, quality-driven software engineering approach. DICE delivers this capability, providing a quality-driven development environment for data-intensive applications.

In particular, DICE offers a DevOps methodology and platform covering multiple aspects of the lifecycle of a Big Data application. A collection of 14 tools has been created and released as open source. The tools can guide the definition of new Big Data applications or the extension of existing ones. A knowledge repository has been created to help end users explore the different features of the tools, as well as navigate the supporting tutorials and videos.

The open source release of the DICE framework is available free of charge and offers development and operations teams:

  • An Eclipse-based IDE implementing the DICE DevOps methodology and guiding the user step-by-step through the use of cheatsheets
  • A new UML profile to design data-intensive applications taking into account quality-of-service requirements and featuring privacy-by-design methods
  • Quality analysis tools to simulate, verify, and optimize the application design and identify possible anti-patterns
  • OASIS TOSCA-compliant deployment and orchestration on cloud VMs and containers
  • Monitoring and anomaly detection tools based on the Elasticsearch-Logstash-Kibana stack
  • Runtime methods for configuration optimization, testing and fault injection
  • Native support for open-source Apache platforms such as Storm, Spark, Hadoop, and Cassandra.

The DICE framework is also available in two commercial versions focused on real-time applications (DICE Velocity) and batch processing system development and delivery (DICE BatchPro).

The DICE tools have been presented to, and are actively downloaded by, a diverse group of stakeholders. Videos illustrating the cross-cutting benefits of the solution for different needs and use case scenarios are available on the DICE YouTube channel, together with tutorials on the DICE blog and regular announcements on the DICE Twitter feed.

ENTICE Has Received a Very Positive Final Review


The team met at Tirol House in Brussels for the final rehearsal meeting, to fine-tune presentations and prepare for the final review. The project has been three years in the making and has overcome a series of technical challenges to finally deliver to the community a suite of tools that can enhance the cloud experience for all users.


Our feedback was very positive, and the reviewers were very impressed with how well the team presented their work on the day. There was also a positive conversation about how the public data sets published by the ENTICE project will be of great interest to the research and development community.

All in all, it was a fruitful and pleasant day, and it gave the consortium great confidence as we look ahead to our further exploitation and collaboration plans.

ENTICE Final Project Meeting in Innsbruck


The ENTICE team met for the final time ahead of the final review in Brussels on Wednesday 14th March, where all of the ENTICE tools will be demonstrated live before the project officers. The meeting was very productive and was hosted by research partner UIBK in the ski resort city of Innsbruck, Austria, amid a mountainous landscape of icy peaks and snow-powdered forests. The clear mountain air certainly helped the discussions and conclusions, as the remaining loose ends were securely tied together with a clear vision of the road ahead.

flexiOPS Use Case for ENTICE


Cloud computing has dramatically transformed the way data is stored and accessed. Many services and applications now use cloud computing technology to give their users the means to store, process and access data. Although cloud computing is undoubtedly a huge success, some challenges and concerns remain. Image deployment in particular can take a considerable amount of time, from choosing an image to getting it up and running.

The deployment of the ENTICE tools to our FCO platform is still ongoing. Nevertheless, some preliminary results have been gathered in order to validate the effectiveness of the tools running within our testbed.

The following results were gathered using the original use case images and compared with the same images after being optimised by the ENTICE VMI Optimiser. These images were then downloaded into FCO and used to create new VMs in order to measure storage and time differences in various areas.
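For illustration only, the kind of comparison gathered here reduces to simple before/after deltas; the sketch below uses placeholder numbers, not the project’s measured values:

```java
// Hypothetical before/after comparison for one optimised VM image.
// All numbers are placeholders for illustration, not ENTICE measurements.
public class VmiComparison {

    static double reductionPercent(double before, double after) {
        return 100.0 * (before - after) / before;
    }

    public static void main(String[] args) {
        double sizeBeforeGb = 8.0,    sizeAfterGb = 2.5;     // image size on disk
        double deployBeforeS = 420.0, deployAfterS = 150.0;  // time to deploy the image into FCO
        double bootBeforeS = 95.0,    bootAfterS = 60.0;     // time for the new VM to boot

        System.out.printf("Image size reduced by %.1f%%%n",
                reductionPercent(sizeBeforeGb, sizeAfterGb));
        System.out.printf("Deployment time reduced by %.1f%%%n",
                reductionPercent(deployBeforeS, deployAfterS));
        System.out.printf("Boot time reduced by %.1f%%%n",
                reductionPercent(bootBeforeS, bootAfterS));
    }
}
```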

[Results table: VM image size, deployment time and boot time on FCO, before and after optimisation by ENTICE]

As can be seen, this ENTICE tool has made a substantial difference to both the size of the VM images and to image deployment and boot times on FCO. The ENTICE team is confident that our product will be ready to hit the ground running as a sustainable and attractive solution for the cloud market.

BEACON Has Been Approved!


The team met for a final preparation meeting at the offices of CETIC in Charleroi, Belgium, to fine-tune presentations and get ready to sit before the European Commission in Brussels. We are very pleased to announce that the Commission has approved our project, which took two years to build, and the BEACON product is now set to take its place in the world of cloud federation as a living deliverable.


BEACON Presented at FICloud 2017


On 22/08/2017 Philippe Massonet presented the paper "Security in Lightweight Network Function Virtualisation for Federated Cloud and IoT" at FICloud 2017 (The IEEE 5th International Conference on Future Internet of Things and Cloud) in Prague, Czech Republic. The paper describes how the BEACON security architecture for federated cloud networking could be extended to federate with sensor networks. The paper proposes securing sensor networks at the edge using Network Function Virtualisation and Service Function Chaining. A key conclusion drawn from discussions at the conference is that there is a need for lightweight NFV/SFC. NFV/SFC solutions for clouds assume the availability of cloud resources to scale and adapt to processing demand. Sensor and actuator networks with limited processing and storage resources need lightweight NFV/SFC solutions.

BEACON Security Use Case Animation

See the BEACON project come to life with this real-life use case animation, which sets out the problems faced by hybrid cloud users who need to migrate across clouds and shows how the BEACON solution fits into an SME’s toolkit for protecting their clients’ VMs. The security use case scenario was handled by flexiOPS and we are confident in what we have produced. We very much look forward to the final review in October, and look out for BEACON as the product takes to market.

 

BEACON Represented at SummerSOC Poster Session, Crete

Craig Sheridan, Managing Director of flexiOPS, and Massimo Villari, Associate Professor at the University of Messina, represented the BEACON project at a poster session at this year’s Symposium on Service Oriented Computing on Crete. The well-established summer school proved a great opportunity to generate interest in, and feedback on, not only the concepts but also the results of BEACON.

SummerSOC 2017, Crete

Craig Sheridan, Managing Director of industrial partner flexiOPS, presented ‘Deployment-time Multi-cloud Application Security’ to the summer school on Crete this June. The 11th Symposium on Service Oriented Computing heard the case for a concrete security baseline for VM applications, with keen interest shown in the Q&A session.

PaaSage, Quorum and IBSCY: Winning Customers to the Cloud

Quorum is a software solution that supports organisations’ entity management and companies’ secretarial operations, as well as assisting their corporate compliance. It is used by major auditing, legal, trust and specialist providers offering corporate secretarial and other professional services in more than 25 countries worldwide. Quorum principally covers entity management and company administration, contact and client management, KYC compliance, banking administration, and time and billing. The main benefits of using Quorum are optimising client and entity management operations; increasing client billing through better tracking and monitoring of chargeable work; improving compliance, quality of work, security and traceability while reducing the opportunity for human error; and managing information and documents accurately, reliably and efficiently. Information becomes instantly accessible at the touch of a button, allowing you to achieve high levels of productivity from your staff.

There are two very important advantages for companies that use PaaSage. The first is increased flexibility: companies using PaaSage are not bound to a single cloud provider and can seamlessly switch provider simply by changing the cloud model. The second is rapid elasticity: with a local cloud infrastructure in place, it is very difficult, time-consuming and costly to plan ahead for occasions when additional resources will be required.

IBSCY’s cloud strategy can be enhanced by PaaSage, as PaaSage allows customers to deploy and move an application across multiple cloud providers and configurations. PaaSage helps IBSCY stay competitive and increase its flexibility, so it can address diverse client cloud requirements and scale on demand when more resources are needed. To find out more, go to paasage.eu and get started today.

ENTICE and Elecnor Deimos: Earth Observation

Let’s find out how ENTICE technology is helping to improve the Earth observation industry. Earth observation is all about collecting spatial and temporal data about the world. This data can be useful in a diverse range of industries, including environmental monitoring, observing natural disasters and civil security systems. The last decade started with $200 million worth of commercial sales in Earth observation; by 2010 the figure had risen to $1.1 billion, and the forecast is to begin 2019 with $4 billion worth of sales. It is a market that is growing at a steady rate. To take advantage of this, the European Commission, in partnership with the European Space Agency and the European Environment Agency, created the Copernicus system to provide Europe with an operational and autonomous capability to observe the Earth. Despite the importance of Earth observation across multiple industries, access to information obtained from satellites still follows traditional and expensive paths to cover demand. This presents several drawbacks: the cost of acquiring up-to-date images of the Earth is prohibitively expensive for new entrants to the market; existing customers cannot access images directly; current methods require a great deal of processing and ad hoc delivery; and the service lacks the flexibility to cope with sudden changes in demand. Here at ENTICE we believe that cloud computing could be the solution, because cloud computing is scalable, flexible and globally accessible.

Wellness Use Case

Let’s find out how Wellness Telecom is using ENTICE virtual machine image reduction technology to improve its services and win new customers. Unified communication is an integrated and tailored service that lets you manage all business communication in the same application. The custom images needed for the service are stored and managed by Wellness, and users pay for the resources used in their storage. While there are solutions for allocating extra resources to meet unforeseen demand, there is often a drop in quality of service given the difficulty of meeting a spike in demand. The challenge, and business opportunity, for us here at Wellness is to find a solution where users only pay for the resources they need without a reduction in quality. Working with ENTICE, we have a solution that lets the service use new resources only when needed, taking advantage of ENTICE’s faster deployment speeds and adapting to demand. And as Wellness manages all the tailored images needed for the service, users leverage the size reduction provided by ENTICE to pay lower prices, while we use fewer resources all round. For more information about how ENTICE is helping businesses enhance their services, and to learn about the innovations behind ENTICE, go to entice-project.eu.

We offer a catalogue of services providing third-party enterprise solutions, aimed at companies that don’t have the knowledge to install and deploy them themselves. The customer is billed based on the resources used for their service and the storage used for virtual machine images. Currently the images are not optimised, leaving the customer paying for extra resources. Our objective here at Wellness is that the customer only pays for the resources that are really needed. By taking advantage of the virtual machine image size reduction offered by ENTICE, we make our services more attractive, lower costs, improve competitiveness and reduce resource use. ENTICE helps us pass resource savings along to our customers, winning us new business and making our service users happy.

Budapest - Plenary Meeting

The team met up for their plenary meeting this January in Budapest to discuss and present the progress so far, ahead of the next Commission review later in the year.

BEACON Meeting in Madrid

The team met recently in Madrid at the OpenNebula offices to discuss the final phase of the project. The team are happy to say that everything is on track, and they look forward to the OpenStack Summit in Boston in May and to the BEACON workshop, which is part of the SmartCOMP conference in Hong Kong.


The call for papers for this workshop is still open, the deadline being April 9th.  
See more here: http://fenci2017.unime.it/

Rich Client Platform for the DIA-integrated Development

DICE focuses on quality assurance for data-intensive applications (DIA) developed through the Model-Driven Engineering (MDE) paradigm. The project aims to deliver methods and tools that help satisfy quality requirements in data-intensive applications through iterative enhancement of their architecture design. One component of the tool chain developed within the project is the DICE IDE, an Integrated Development Environment (IDE) that accelerates the development of data-intensive applications.

The Eclipse-based DICE IDE integrates most of the tools of the DICE framework and is the basis of the DICE methodology. As highlighted in deliverable D1.1 State of the Art Analysis, no MDE IDE yet exists on the software market through which a designer can create models to describe and analyse data-intensive or Big Data applications and their underpinning technology stack. This is the motivation for defining the DICE IDE.

The DICE IDE is based on Eclipse, the de-facto standard for the creation of software engineering models based on the MDE approach. DICE customizes the Eclipse IDE with suitable plug-ins that integrate the execution of the different DICE tools, in order to minimize learning curves and simplify adoption. In this blog post we explain how the DICE tools introduced to the reader earlier have been integrated into the IDE. So, how is the DICE IDE built?

 

How is the DICE IDE built?

The DICE IDE is an application based on Eclipse. While the Eclipse platform is designed to serve as an open platform for tool integration, it is architected so that its components can be used to build arbitrary client applications. The minimal set of plug-ins needed to build a rich client application is collectively known as the Rich Client Platform (RCP). Applications other than IDEs can be built using a subset of the platform. These rich applications are still based on a dynamic plug-in model, and the UI is built using the same toolkits and extension points. The layout and function of the workbench are under the fine-grained control of the plug-in developer.

An Eclipse application consists of several Eclipse components, and as a developer you can extend the Eclipse IDE via plug-ins (components). Eclipse applications incorporate runtime features based on OSGi; in this runtime environment you can add, update or remove features of your application using OSGi bundles (components).

The minimum piece of software that can be integrated in Eclipse is called a plug-in. The Eclipse platform allows the developer to extend Eclipse applications like the Eclipse IDE with additional functionalities via plug-ins.

Eclipse applications use a runtime based on a specification called OSGi. A software component in OSGi is called a bundle. An OSGi bundle is also always an Eclipse plug-in. Both terms can be used interchangeably.
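As a minimal illustration of that equivalence (generic boilerplate, not a DICE plug-in), the life cycle of an Eclipse plug-in is controlled by a plain Java class implementing the OSGi BundleActivator interface:

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Minimal bundle activator: the OSGi runtime calls start() and stop()
// when the bundle (i.e. the Eclipse plug-in) is activated and deactivated.
public class Activator implements BundleActivator {

    private static BundleContext context;

    @Override
    public void start(BundleContext bundleContext) {
        context = bundleContext;   // keep the context for later service look-ups
        System.out.println("Plug-in started");
    }

    @Override
    public void stop(BundleContext bundleContext) {
        System.out.println("Plug-in stopped");
        context = null;
    }

    public static BundleContext getContext() {
        return context;
    }
}
```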

The Eclipse IDE is basically an Eclipse RCP application that supports development activities. Even core functionalities of the Eclipse IDE are provided via plug-ins. For example, both the Java and C development tools are contributed as sets of plug-ins. Therefore, the Java or C development capabilities are available only if these plug-ins are present.

The Eclipse IDE functionality is heavily based on the concept of extensions and extension points. For example, the Java Development Tools provide an extension point to register new code templates for the Java editor.

Via additional plug-ins you can contribute to an existing functionality, for example new menu entries, new toolbar entries or provide completely new functionality. But you can also create completely new programming environments.

The minimal set of plug-ins required to create and run an Eclipse RCP application (with UI) comprises the two plug-ins “org.eclipse.core.runtime” and “org.eclipse.ui”. Based on these components, an Eclipse RCP application must define the following elements:

  • Main program – an RCP main application class implementing the interface “IApplication”. This class can be viewed as the equivalent of the main method of a standard Java application. Eclipse expects the application class to be defined via the extension point “org.eclipse.core.runtime.applications” (a minimal sketch of such a class follows this list).
  • A Perspective – defines the layout of your application. It must be declared via the extension point “org.eclipse.ui.perspectives”.
  • Workbench Advisor – an invisible technical component that controls the appearance of the application (menus, toolbars, perspectives, etc.).
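A hedged sketch of the first of these elements follows (generic RCP boilerplate rather than the actual DICE IDE sources; the perspective id is invented for the example):

```java
import org.eclipse.equinox.app.IApplication;
import org.eclipse.equinox.app.IApplicationContext;
import org.eclipse.swt.widgets.Display;
import org.eclipse.ui.IWorkbench;
import org.eclipse.ui.PlatformUI;
import org.eclipse.ui.application.WorkbenchAdvisor;

// RCP entry point, registered via the org.eclipse.core.runtime.applications extension point.
public class Application implements IApplication {

    @Override
    public Object start(IApplicationContext context) {
        Display display = PlatformUI.createDisplay();
        try {
            // The workbench advisor tells the platform which perspective to open first.
            int result = PlatformUI.createAndRunWorkbench(display, new WorkbenchAdvisor() {
                @Override
                public String getInitialWindowPerspectiveId() {
                    return "com.example.rcp.perspective";   // hypothetical perspective id
                }
            });
            return result == PlatformUI.RETURN_RESTART ? IApplication.EXIT_RESTART
                                                       : IApplication.EXIT_OK;
        } finally {
            display.dispose();
        }
    }

    @Override
    public void stop() {
        if (!PlatformUI.isWorkbenchRunning()) {
            return;
        }
        final IWorkbench workbench = PlatformUI.getWorkbench();
        final Display display = workbench.getDisplay();
        // stop() may be called from a non-UI thread, so close the workbench on the UI thread.
        display.syncExec(() -> {
            if (!display.isDisposed()) {
                workbench.close();
            }
        });
    }
}
```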

DICE Tools integration approaches

The Eclipse-based DICE IDE integrates most of the tools of the DICE framework. Due to the different nature of the tools, not all of them can be completely integrated within the IDE, so an alternative had to be provided: some tools have their real execution environment outside the IDE (they are not Eclipse plug-ins), for instance on an external web site or server.

The DICE IDE offers two ways for a tool to be integrated:

  • Fully integrated
  • Externally integrated

Both kinds of integration share a common component within the IDE. This component contributes a menu to the IDE, through which the user can interact with all the integrated tools (Figure 1).
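As a hedged illustration of how such a menu entry is backed by code (the handler class and message are invented, not taken from the DICE sources), each menu item is bound to a command whose handler extends Eclipse’s AbstractHandler:

```java
import org.eclipse.core.commands.AbstractHandler;
import org.eclipse.core.commands.ExecutionEvent;
import org.eclipse.core.commands.ExecutionException;
import org.eclipse.jface.dialogs.MessageDialog;
import org.eclipse.ui.handlers.HandlerUtil;

// Handler executed when the user selects a tool from the DICE menu.
// The command and menu contribution themselves are declared in plugin.xml.
public class LaunchToolHandler extends AbstractHandler {

    @Override
    public Object execute(ExecutionEvent event) throws ExecutionException {
        MessageDialog.openInformation(
                HandlerUtil.getActiveShell(event),
                "DICE",
                "Launching the selected DICE tool...");
        return null;   // command handlers conventionally return null
    }
}
```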

 

Figure 1. The menu for a DICE tool in the DICE IDE.

External integration:

This approach is the easiest. It is used when the real execution environment of the tool is placed outside the IDE, for instance within an external server or web service.

The only requirement for this approach is the information needed to connect to the external application, typically a URL:

  • Protocol: HTTP or HTTPS
  • Server: the address of the server
  • Port: the port where the server remains available
  • Parameters: possible parameters to be passed when the web service is visited (user id, token…)

A plug-in implements an abstract mechanism that is offered to all tools that prefer this kind of integration. It opens the given page in Eclipse’s internal web browser, allowing the user to access it from within the IDE. An example of such an integration is given in Figure 2 with the DICE Monitoring tool.
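A hedged sketch of what that mechanism amounts to (generic Eclipse browser-support code with invented names, not the actual DICE plug-in):

```java
import java.net.URL;
import org.eclipse.ui.PlatformUI;
import org.eclipse.ui.browser.IWebBrowser;
import org.eclipse.ui.browser.IWorkbenchBrowserSupport;

// Opens an externally hosted tool (e.g. a monitoring dashboard) inside Eclipse's
// internal web browser, using the connection details listed above.
public class ExternalToolOpener {

    public void open(String protocol, String server, int port, String params) throws Exception {
        String url = protocol + "://" + server + ":" + port + "/"
                + (params == null || params.isEmpty() ? "" : "?" + params);

        IWorkbenchBrowserSupport support = PlatformUI.getWorkbench().getBrowserSupport();
        IWebBrowser browser = support.createBrowser(
                IWorkbenchBrowserSupport.AS_EDITOR,   // show the page as an editor tab in the IDE
                "external.tool.browser",              // browser instance id (hypothetical)
                "External Tool",                      // tab title
                "Externally integrated tool");        // tooltip
        browser.openURL(new URL(url));
    }
}
```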

 

Figure 2. Example of Monitoring Tool, an external tool integration.

The IDE also provides an abstract Eclipse preferences page that allows the user to modify these properties (Figure 3). In this way, the external web service tool integration can be modified dynamically by the user if needed.
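Such a page can be built on Eclipse’s standard JFace preference support; a minimal sketch (the field labels and preference keys are invented for the example, not taken from the DICE IDE):

```java
import org.eclipse.jface.preference.FieldEditorPreferencePage;
import org.eclipse.jface.preference.StringFieldEditor;
import org.eclipse.ui.IWorkbench;
import org.eclipse.ui.IWorkbenchPreferencePage;
import org.eclipse.ui.PlatformUI;

// Preference page where the user can change how an externally integrated tool is reached.
public class ExternalToolPreferencePage extends FieldEditorPreferencePage
        implements IWorkbenchPreferencePage {

    public ExternalToolPreferencePage() {
        super(GRID);
        setDescription("Connection settings for the externally integrated tool");
    }

    @Override
    public void init(IWorkbench workbench) {
        // A real plug-in would normally use its own preference store;
        // the workbench-level store keeps the sketch self-contained.
        setPreferenceStore(PlatformUI.getPreferenceStore());
    }

    @Override
    protected void createFieldEditors() {
        addField(new StringFieldEditor("tool.protocol", "Protocol (http/https):", getFieldEditorParent()));
        addField(new StringFieldEditor("tool.server", "Server address:", getFieldEditorParent()));
        addField(new StringFieldEditor("tool.port", "Port:", getFieldEditorParent()));
    }
}
```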

 

Figure 3. Example of Monitoring Tool external web service configuration.

Full integration:

This approach requires more effort from the tool owner, since the aim is to develop fully functional support within the IDE that allows the user to interact with the tool and perform all the needed operations.

Some Eclipse development skills are required. There are plenty of Eclipse tutorials available on the Internet that can be used to learn how to develop Eclipse plug-ins and contribute new functionality to the IDE, such as wizards, dialogs, launchers and views.

The more complex the tool, the more difficult it will be to integrate it within the IDE.

Figure 4 shows an example of a fully integrated tool, in this case the Simulation tool.

 

Figure 4. An example of the Simulation Tool, a fully integrated tool.

Conclusions

This post described the basic features of the DICE IDE, in particular the two integration patterns provided by the integrated environment, and gave examples of integrated DICE tools. Due to the different nature of the tools, not all of them can be completely integrated within the IDE. All tools, regardless of the integration approach used, are accessible through the DICE menu.

The IDE was released on GitHub in January 2017 as part of the DICE Knowledge Repository. A complete tutorial and a YouTube channel allow any interested designer, administrator, quality engineer or system architect to get started quickly with the IDE.

Christophe Joubert, Ismael Torres (PRO)