Available REST APIs

  • Profile & Social Networking: get profile and personal information, or boost your content's reach by making it easy for people to share it on Virtual Research Environments (VREs).  Get started
  • Information System: learn how to interact with resources hosted in the Infrastructure and its Virtual Research Environments (VREs). Get started
  • Workspace (Storage Hub): learn how to browse, upload and download your workspace files and folders.  Get started
  • Resource Catalogue (gCat): learn how to publish and search collections of metadata for items including data, services, and related information objects.  Get started
  • Geoportal: learn how to manage complex spatio-temporal documents, their materialisation and indexing in different platforms while maximising reusability. Get started

Spatial Data Infrastructure

The Spatial Data Infrastructure offers standards-based services for managing spatial datasets and associated metadata.

Identity and Access Management

The D4Science Identity and Access Management (IAM) system uses state-of-the-art industry standards for authentication and authorization, fully adopting OpenID Connect (OIDC) for authentication and User-Managed Access (UMA 2.0) for authorization flows. Both protocols build on the OAuth 2.0 specification.

Learn how it works and how you can use it to obtain access to D4Science resources.
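As a rough sketch of these two flows, the Python snippet below (using the requests library) first obtains an OIDC access token with the client_credentials grant and then exchanges it for a UMA 2 authorization token. The token endpoint, client credentials and audience value are placeholders rather than real D4Science values; take the actual endpoint from the IAM well-known configuration.

    import requests

    # Placeholders: take the real token endpoint from the IAM well-known
    # configuration; client id/secret are issued when your client is registered.
    TOKEN_ENDPOINT = "https://<iam-host>/realms/d4science/protocol/openid-connect/token"
    CLIENT_ID = "my-client"
    CLIENT_SECRET = "my-secret"

    # 1) OIDC: authenticate the client and obtain an access token
    #    (client_credentials grant).
    resp = requests.post(TOKEN_ENDPOINT, data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    resp.raise_for_status()
    access_token = resp.json()["access_token"]

    # 2) UMA 2: exchange the access token for an authorization token scoped to a
    #    target context (the audience value below is an assumption, not a real VRE).
    uma = requests.post(
        TOKEN_ENDPOINT,
        headers={"Authorization": f"Bearer {access_token}"},
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:uma-ticket",
            "audience": "/d4science.research-infrastructures.eu/MyVRE",
        },
    )
    uma.raise_for_status()
    uma_token = uma.json()["access_token"]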

Shiny (Proxy) apps

A Shiny (proxy) app can be deployed in different ways:

  • It can be a public app available on Docker Hub or any other public container registry. In this case, the image name and the run command are the only requirements.
  • A build of a public image can be requested; the source must be accessible from our Jenkins instance so that the process can be automated. The resulting container image will be uploaded to Docker Hub.
  • A build of a private image can be requested; the source must be accessible from our Jenkins instance so that the process can be automated. The resulting container image will be uploaded to the D4Science private registry.

For any of the above, request support for a new functionality.

Sign in with D4Science

By using the OAuth 2.0 protocol, we allow an application to access D4Science data while protecting the member's credentials. Sign In with D4Science is achieved by means of OpenID Connect (OIDC), a thin identity layer on top of OAuth 2.0 that adds login and profile information about the person who is logged in. To exploit this functionality you need the OpenID Connect well-known configuration URI and you must be authorised by D4Science.

Request support for this functionality.
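As an illustrative sketch in Python, the snippet below reads the OIDC endpoints from a well-known configuration document and builds the authorization-code login URL that redirects a user to the D4Science sign-in page. The well-known URI, client id and redirect URI are placeholders; the real values are available once your application has been authorised.

    import secrets
    import urllib.parse
    import requests

    # Placeholders: the real well-known URI and client registration are provided
    # by D4Science once the application has been authorised.
    WELL_KNOWN_URI = "https://<iam-host>/realms/d4science/.well-known/openid-configuration"
    CLIENT_ID = "my-registered-app"
    REDIRECT_URI = "https://my-app.example.org/callback"

    # Discover the OIDC endpoints from the well-known configuration document.
    config = requests.get(WELL_KNOWN_URI).json()
    authorization_endpoint = config["authorization_endpoint"]

    # Build the authorization-code request that sends the user to the
    # "Sign in with D4Science" login page.
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile email",
        "state": secrets.token_urlsafe(16),  # anti-CSRF value, checked on callback
    }
    print(authorization_endpoint + "?" + urllib.parse.urlencode(params))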

Analytics Engine

Cloud Computing Platform (CCP)

  • How to implement custom Methods/Algorithms in the new Analytics Engine  Learn more
  • Interact with the UI and execute your custom Methods/Algorithms on the Cloud  Learn more
  • Interact with CCP via the standard OGC API - Processes; a request sketch follows this list. Learn more
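As a non-normative sketch, the Python snippet below shows the two basic OGC API - Processes calls: listing the available processes and submitting an execution request. The API root URL, the process identifier, its inputs and the token are placeholders, not actual CCP values.

    import requests

    # Placeholders: the OGC API - Processes root URL, the process id and the
    # inputs depend on your VRE and method; the token is issued by the D4Science IAM.
    API_ROOT = "https://<ccp-host>"
    HEADERS = {"Authorization": "Bearer <token>"}

    # List the published processes (standard endpoint: GET /processes).
    processes = requests.get(f"{API_ROOT}/processes", headers=HEADERS).json()
    for p in processes.get("processes", []):
        print(p["id"], "-", p.get("title", ""))

    # Submit an execution request (standard endpoint: POST /processes/{id}/execution).
    run = requests.post(
        f"{API_ROOT}/processes/my-method/execution",
        headers={**HEADERS, "Content-Type": "application/json"},
        json={"inputs": {"input_file": "input.csv", "iterations": 10}},
    )
    run.raise_for_status()
    # For asynchronous execution the job status URL is returned in the Location header.
    print(run.status_code, run.headers.get("Location"))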

DataMiner

  • Implement custom Methods/Algorithms for DataMiner: learn how to implement your own Methods/Algorithms for DataMiner.  Learn more
  • DataMiner Manager: learn how to interact with the DataMiner Manager Web App and execute your custom Methods/Algorithms on the Cloud.  Learn more
  • Interact with DataMiner via the Web Processing Service (WPS): learn how to interact with the DataMiner Service via the OpenGIS® Web Processing Service (WPS); a request sketch follows this list. Learn more
  • Methods Engine Facilities: discover the features, services and methods for performing data processing and mining on information sets over D4Science nodes. Learn more
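As a rough sketch of a WPS interaction in Python, the snippet below sends the two standard discovery requests, GetCapabilities and DescribeProcess. The endpoint URL, the process identifier and the way the authorization token is passed are assumptions; check the DataMiner documentation for the actual values.

    import requests

    # Placeholders: endpoint, identifier and token-passing convention are assumptions.
    WPS_ENDPOINT = "https://<dataminer-host>/wps/WebProcessingService"
    TOKEN = "<your-authorization-token>"

    # Discover the published Methods/Algorithms (standard WPS GetCapabilities).
    caps = requests.get(WPS_ENDPOINT, params={
        "service": "WPS",
        "version": "1.0.0",
        "request": "GetCapabilities",
        "gcube-token": TOKEN,  # assumption: token passed as a query parameter
    })
    caps.raise_for_status()

    # Inspect the inputs and outputs of a single process (identifier is made up).
    desc = requests.get(WPS_ENDPOINT, params={
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": "org.example.MyAlgorithm",
        "gcube-token": TOKEN,
    })
    desc.raise_for_status()
    print(desc.text[:500])  # XML ProcessDescriptions document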

Docker Swarm

A Docker Swarm cluster is available to deploy and run Docker containers. Only Docker containers are supported at this time and they can be deployed in different ways:

  • It can be a public container already available on Docker Hub or any other public container registry.
  • A build of a public image can be requested; the source must be accessible from our Jenkins instance so that the process can be automated. The resulting container image will be uploaded to Docker Hub and deployed into the cluster.
  • A build of a private image can be requested; the source must be accessible from our Jenkins instance so that the process can be automated. The resulting container image will be uploaded to the D4Science private registry and deployed into the cluster.

For any of the above, request support for a new functionality.