wis2-poc-1's Introduction

OpenWis

From the "About OpenWIS" page:

OpenWIS is an implementation of the WMO Information System (WIS) and aims to perform the three functions required by WIS, i.e. GISC, DCPC and NC. It was originally developed by AKKA Information System (France) on behalf of the OpenWIS Association (with UKMO, MF, BoM, KMA, and MFI as the founding members).

1. Requirements

  • Oracle JDK 1.7
  • Maven 2.2.1

Environment variables:

JAVA_HOME should be set to a valid JDK installation path.

MAVEN_OPTS should be set to "-Xms256m -Xmx512m -XX:MaxPermSize=256m" to avoid OutOfMemory errors during the build.

The "mvn" command should be available in the terminal; if it is not, add the Maven bin directory to the PATH.

2. Source Contents

Repository Contents:

  .
  +-- pom.xml
  |+--openwis-dataservice
      |+--openwis-dataservice-cache
         |+-- openwis-dataservice-cache-core
         |+-- openwis-dataservice-cache-ejb
         |+-- openwis-dataservice-cache-ftpreplication (not used)
         |+-- openwis-dataservice-cache-test (not used)
         |+-- openwis-dataservice-cache-webapp
      |+--openwis-dataservice-common
         |+-- openwis-dataservice-common-domain
         |+-- openwis-dataservice-common-timer
         |+-- openwis-dataservice-common-utils
      |+--openwis-dataservice-server
         |+-- openwis-dataservice-server-ear
         |+-- openwis-dataservice-server-ejb
         |+-- openwis-dataservice-server-test (not used)
         |+-- openwis-dataservice-server-webapp
  |+--openwis-factorytests
  |+--openwis-harness 
      |+--openwis-harness-client
      |+--openwis-harness-dissemination
      |+--openwis-harness-localdatasource
      |+--openwis-harness-mssfss
      |+--openwis-harness-samples (not used)
      |+--openwis-harness-subselectionparameters
  |+--openwis-management
      |+--openwis-management-client
      |+--openwis-management-server
  |+--openwis-metadataportal
      |+--cachingxslt
      |+--jeeves
      |+--oaipmh
      |+--openwis-portal  
      |+--openwis-portal-solr
      |+--schematrons (not used)
      |+--sde
  |+--openwis-portal-client
  |+--openwis-securityservice
      |+--openwis-securityservice-war
      |+--openwis-securityservice-utils
		|+--GenerateSPConfFiles
		|+--PopulateLDAP
  |+--openwis-stagingpost
  |+--resources
  |+--openwis-libs
  • openwis-dataservice: contains the dataservice module (including the cache system)
  • openwis-factorytests: contains selenium functional tests
  • openwis-harness: contains harness WSDL (dissemination, localdatasource) and samples harness implementations
  • openwis-management: contains management module (monitoring and control)
  • openwis-metadataportal: contains portals module
  • openwis-release: contains some binaries (JBoss, PostgreSQL, ...)
  • openwis-securityservice: contains the security module
  • openwis-stagingpost: contains the staging post web application
  • resources: contains jars required for build but not available on Public Maven repositories
  • openwis-libs: module which uses the Maven install-file plugin to install the OpenAM and ForgeRock dependencies; it must be built first, with the profile 'dependencies'

Harness WSDL

  • The harness dissemination WSDL and XSD: openwis-harness/openwis-harness-dissemination/src/wsdl/
  • The harness localdatasource WSDL and XSD: openwis-harness/openwis-harness-localdatasource/src/wsdl/
  • The harness MSSFSS WSDL and XSD: openwis-harness/openwis-harness-mssfss/src/wsdl/
  • The harness subselectionparameters WSDL and XSD: openwis-harness/openwis-harness-subselectionparameters/src/wsdl/

3. Building OpenWIS

#1. Installing JavaEE

The JavaEE library ships with GlassFish installations but is not available in any public Maven repository. In a terminal, launch the command: mvn clean exec:exec

#2. Launch the Maven build

In a terminal, go into the root directory and launch the command: mvn clean install -Pdependencies,openwis -DskipTests -Dfile.encoding=UTF-8

After a few minutes, the build should finish successfully with output like this:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] ------------------------------------------------------------------------
[INFO] OpenWIS ............................................... SUCCESS [1.248s]
[INFO] openwis-libs .......................................... SUCCESS [2.572s]
[INFO] openwis-harness ....................................... SUCCESS [0.000s]
[INFO] openwis-harness-subselectionparameters ................ SUCCESS [6.724s]
[INFO] openwis-harness-localdatasource ....................... SUCCESS [1.232s]
[INFO] openwis-harness-client ................................ SUCCESS [0.796s]
[INFO] openwis-harness-mssfss ................................ SUCCESS [1.560s]
[INFO] openwis-harness-dissemination ......................... SUCCESS [1.264s]
[INFO] openwis-management .................................... SUCCESS [0.046s]
[INFO] openwis-management-client ............................. SUCCESS [4.134s]
[INFO] openwis-management-service ............................ SUCCESS [0.016s]
[INFO] openwis-management-service-common ..................... SUCCESS [1.045s]
[INFO] openwis-management-service-ejb ........................ SUCCESS [6.740s]
[INFO] openwis-management-service-ear ........................ SUCCESS [4.118s]
[INFO] openwis-dataservice ................................... SUCCESS [0.016s]
[INFO] openwis-dataservice-common ............................ SUCCESS [0.015s]
[INFO] openwis-dataservice-common-utils ...................... SUCCESS [1.513s]
[INFO] openwis-dataservice-common-domain ..................... SUCCESS [2.762s]
[INFO] openwis-dataservice-common-timer ...................... SUCCESS [2.761s]
[INFO] openwis-dataservice-cache ............................. SUCCESS [0.016s]
[INFO] openwis-dataservice-cache-core ........................ SUCCESS [0.998s]
[INFO] openwis-dataservice-cache-ejb ......................... SUCCESS [3.354s]
[INFO] openwis-dataservice-cache-webapp ...................... SUCCESS [1.728s]
[INFO] openwis-dataservice-server ............................ SUCCESS [0.011s]
[INFO] openwis-dataservice-server-ejb ........................ SUCCESS [4.073s]
[INFO] openwis-dataservice-server-webapp ..................... SUCCESS [1.295s]
[INFO] openwis-dataservice-server-ear ........................ SUCCESS [0.515s]
[INFO] openwis-dataservice-config ............................ SUCCESS [2.621s]
[INFO] openwis-portal-client ................................. SUCCESS [17.862s]	
[INFO] GeoNetwork opensource ................................. SUCCESS [0.000s]
[INFO] Caching xslt module ................................... SUCCESS [0.514s]
[INFO] Jeeves modules ........................................ SUCCESS [4.181s]
[INFO] Oaipmh modules ........................................ SUCCESS [1.155s]
[INFO] ArcSDE module (dummy-api) ............................. SUCCESS [1.560s]
[INFO] openwis-portal-solr ................................... SUCCESS [7.301s]
[INFO] openwis-securityservice ............................... SUCCESS [0.015s]
[INFO] openwis-securityservice-utils ......................... SUCCESS [0.000s]
[INFO] openwis-securityservice-utils-populate-ldap ........... SUCCESS [4.181s]
[INFO] openwis-securityservice-war ........................... SUCCESS [9.657s]
[INFO] openwis-stagingpost ................................... SUCCESS [0.327s]
[INFO] OpenWIS module ........................................ SUCCESS [36.687s]
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3 minutes 43 seconds
[INFO] Finished at: Tue Dec 01 13:24:46 GMT 2015
[INFO] Final Memory: 237M/416M
[INFO] ------------------------------------------------------------------------	

#3. Generate the admin portal

In a terminal, go to the root directory and launch the command: mvn clean install -P admin -DskipTests -Dfile.encoding=UTF-8

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] ------------------------------------------------------------------------
[INFO] OpenWIS module ........................................ SUCCESS [26.004s]
[INFO] OpenWis ............................................... SUCCESS [0.402s]
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 26 seconds
[INFO] Finished at: Tue Dec 01 13:25:56 GMT 2015
[INFO] Final Memory: 42M/302M
[INFO] ------------------------------------------------------------------------

#4. Getting built artifacts

  • dataservice EAR: /openwis-dataservice/openwis-dataservice-server/openwis-dataservice-server-ear/target/openwis-dataservice.ear
  • dataservice configs: /openwis-dataservice/openwis-dataservice-config/target/openwis-dataservice-config-files.zip
  • harness client: /openwis-harness/openwis-harness-client/target/openwis-harness-client.jar
  • security service: /openwis-securityservice/openwis-securityservice-war/target/openwis-securityservice.war
  • user portal WAR: /openwis-metadataportal/openwis-portal/openwis-user-portal/openwis-user-portal-user.war
  • admin portal WAR: /openwis-metadataportal/openwis-portal/openwis-admin-portal/openwis-admin-portal-admin.war
  • SolR search WAR: /openwis-metadataportal/openwis-portal-solr/target/openwis-portal-solr.war
  • staging post WAR: /openwis-stagingpost/target/stagingPost.war
  • Management Service: /openwis-management/openwis-management-service/openwis-management-service-ear/target/openwis-management-service.ear

#5. Changing Version Number

In a terminal, go into the root directory and launch the command: mvn versions:set -DnewVersion=0.0.1-SNAPSHOT -P openwis (substituting whatever version you wish to change the project to). In some cases, OpenWIS may complain about missing dependencies; should this happen, try running 'mvn versions:set -DnewVersion=0.0.1-SNAPSHOT -P openwis' followed by 'mvn versions:set -DnewVersion=0.0.1-SNAPSHOT -P admin'.

#6. Maven Releasing

Maven releasing for OpenWIS is a two-step process, as the portal requires the build to be run twice (once for the admin portal and once for the user portal).

In a terminal, go into the root directory and launch the command: mvn -Dresume=false release:prepare -Popenwis release:perform -Dgoals="install". This needs to be followed by a build of the admin portal as described in #3.

This will not deploy to the CloudBees repository, as there are currently permissions issues.

#7. Running SonarCloud

Update the OpenWIS pom.xml to include the following plugin under the pluginManagement section:

<plugin>	
    <groupId>org.sonarsource.scanner.maven</groupId>	
    <artifactId>sonar-maven-plugin</artifactId>	
    <version>3.7.0.1746</version>
</plugin>

In a terminal, execute the analysis with the command:

mvn sonar:sonar -Popenwis,admin,user -Dsonar.organization=${SONAR_ORGANIZATION} -Dsonar.projectKey=${SONAR_PROJECT_KEY} -Dsonar.host.url=${SONAR_URL} -Dsonar.login=${SONAR_KEY}

This does take a while, given the number of modules that need to run.

Please contact the OpenWIS-TC team for the values needed in the command above.

#8. Running OWASP Dependency Check

In order to run the OWASP Dependency Check, the following command must be run:

mvn -Popenwis,user,admin org.owasp:dependency-check-maven:aggregate

SonarQube supports the integration of the report in the analysis results, but unfortunately SonarCloud does not support this feature.

#9. Travis-CI Integration


The OpenWIS project is integrated with Travis-CI, where the build history can be accessed.

Copyright and License

(C) Copyright OpenWIS Association AISBL.

OpenWIS® is free software: you can redistribute it and/or modify it under the terms of the License specified for each OpenWIS® Project (see LICENSE).

wis2-poc-1's People

Contributors

dimipapadeas, nmichas, rogers492, solson-nws, tg4444


wis2-poc-1's Issues

Feedback on priority of Use Cases

OPP-6-WIS-Djibouti Use Cases

Mouktar's use cases:

{1.1} UCS-001: Upload new datasets
UCS-002: Create a Python script to check for 3SD for automatic detection of "bad data" from incoming sources.
{1.2} UCS-003: Use the local web application to automatically generate/edit most of the HTML content of the dataset page
UCS-004: The dataset's HTML page automatically generates CSV visualization in the form of a graph
{1.3} UCS-005: Enhance HTML pages with JSON-LD information related to Google and AWISC
UCS-006: Register site-map with AWISC

Mariam's use cases:

{4.1} UCS-007: Discover weather observation dataset queues (using Google/AWISC)
{4.2} UCS-008: Subscribe to weather observation dataset queues
{4.3} UCS-009: Request re-send of missed notifications

Dave's use cases:

UCS-010: Search for weather observations registered in AWISC via the AWISC web application
UCS-011: Search for weather observations registered in AWISC via the AWISC web services

Mohamed's use cases:

{1.4} UCS-012: Discover observation data via Google Search

Omar's use cases:

{3.1} UCS-013: Create dissemination queues of an uploaded dataset
{3.2} UCS-014: Monitor dissemination queues of an uploaded dataset
UCS-015: Establish federated authentication via a trusted Identity Provider
UCS-016: Create Automatic Weather Station queues

Delly's use cases:

UCS-017: Upload fresh data automatically via web application

Regional data sharing hub's use cases (RDSH):

{3.3} UCS-018: Provide reliable & highly-available pub-sub messaging infrastructure

Authoritative WIS Catalogue's use cases (AWISC):

{2.1} UCS-019: Maintain a catalogue of official WIS datasets
{2.2} UCS-020: Periodically crawl registered pages and update internal index
{2.3} UCS-021: Provide a dataset search page
{2.4} UCS-022: Provide a dataset search (REST) API
{2.5} UCS-023: Have its site-map registered in Google search


Optional:

UCS-024: External service integration (e.g. extreme weather effects to crops bulletin)

UCS-023: Search AWISC through Google

Title 

UCS-023: Search AWISC through Google

Description 

The user visits the Google search page. From there he searches for authoritative weather datasets. Some search results link to AWISC.

Triggers 

The user wishes to find authoritative weather datasets.

Actors 

Public user.

Preconditions 

  • The AWISC's site-map has been registered in Google search.

Steps of Execution

  1. The user visits the Google search page.
  2. The user enters search terms that would match authoritative datasets on AWISC.
  3. Some of the search results link to AWISC.

Success Conditions

The user finds AWISC search result by using Google search.

Dependent Use Cases

UCS-020: Dataset page crawling and indexing

Title 

UCS-020: Dataset page crawling and indexing

Description 

The AWISC web application has spawned a process that crawls and indexes registered dataset pages every hour. This process goes through every registered root page, searches for links to other pages, and crawls them in a recursive fashion, searching for specific JSON-LD data (WMO codes). If that data is found, the application updates its search indexes with that information.
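The recursive crawl described above can be sketched as follows. This is a minimal illustration, not the AWISC implementation: page fetching is abstracted as a dict mapping URL to (html, linked_urls), and the "wmoCodes" JSON-LD key is a hypothetical stand-in for whatever WMO code vocabulary the pages actually use.

```python
import json
import re

# Matches embedded JSON-LD blocks in a page's HTML.
JSON_LD_RE = re.compile(
    r'<script type="application/ld\+json">(.*?)</script>', re.DOTALL
)

def crawl(root, pages, index, seen=None):
    """Recursively visit pages reachable from `root`, extracting WMO codes
    from embedded JSON-LD and recording them in `index` (code -> set of URLs).

    `pages` maps URL -> (html, linked_urls); a real crawler would fetch
    over HTTP instead.
    """
    if seen is None:
        seen = set()
    if root in seen or root not in pages:
        return index
    seen.add(root)
    html, links = pages[root]
    for block in JSON_LD_RE.findall(html):
        data = json.loads(block)
        for code in data.get("wmoCodes", []):  # hypothetical key name
            index.setdefault(code, set()).add(root)
    for link in links:
        crawl(link, pages, index, seen)
    return index
```

The `seen` set guards against the link cycles that recursive crawling of cross-linked dataset pages would otherwise produce.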

Triggers 

Occurs repeatedly, every hour.

Actors 

System.

Preconditions 

  • UCS-019: Maintain an Authoritative WIS Catalogue

Steps of Execution

  1. The AWISC web application loops through every registered dataset root page.
  2. The linked pages are examined for WMO codes.
  3. The AWISC search indexes are updated based on the discovered WMO codes.

Success Conditions

The registered dataset pages are searchable from AWISC.

Dependent Use Cases

  • UCS-021: Observation dataset queue discovery and subscription using the AWISC search page

UCS-013: Dataset dissemination queue creation

Title 

UCS-013: Dataset queue creation

Description 

Mouktar asks Omar to create a new queue in order to disseminate notifications for his new dataset. Omar uses the RDSH user interface and provides the required information in order to create a new queue for Mouktar. When all information is entered, the queue is created and becomes available to both Mouktar and potential subscribers.

Triggers 

Mouktar wants to disseminate notifications when new entries are added to his dataset.

Actors 

Mouktar, Omar.

Preconditions 

  • UCS-015: Omar is authenticated.
  • UCS-018: The RDSH is fully functional.

Steps of Execution

  1. Omar navigates to the "Create Queue" page of the RDSH user interface.
  2. Omar fills in the required information in the form.
  3. Omar clicks on the "Create" button, and the queue is successfully deployed.

Success Conditions

The new queue is successfully deployed.

Dependent Use Cases

  • UCS-014: Dataset dissemination queue monitoring

UCS-014: Dataset queue monitoring

Title 

UCS-014: Dataset queue monitoring

Description 

Omar wants to access information concerning the status and general usage of the RDSH queues. He uses the RDSH user interface and gets access to data such as the following:

  • Number of connected clients
  • All-time maximum number of connected clients
  • Number of disconnected clients
  • Bytes sent
  • Bytes received
  • Number of sent messages
  • Number of received messages
  • Number of stored messages
  • Number of dropped messages
  • Number of in-flight messages
  • Number of active subscriptions

Triggers 

The user wants to monitor the dataset queues.

Actors 

Omar.

Preconditions 

  • UCS-015: Omar is authenticated.
  • UCS-018: The RDSH is fully functional.

Steps of Execution

  1. The user navigates to the "Queue Status" page of the RDSH user interface.
  2. The user views the information presented on the page.

Success Conditions

The user views status information concerning the system queues.

Dependent Use Cases

UCS-009: Re-transmission of undelivered notifications

Title 

UCS-009: Re-transmission of undelivered notifications

Description 

Mariam has not connected to her subscribed queues for several days. When her client connects, the queue will automatically re-transmit all undelivered notification messages, within a maximum period of one week.

Triggers 

The user connects to the queue after a period of absence.

Actors 

Mariam, RDSH.

Preconditions 

  • UCS-007: The user has already subscribed to the specific queue.
  • The queue in question has undelivered messages for Mariam.

Steps of Execution

  1. The user connects to the queue after a period of time.
  2. The RDSH queue detects that there are undelivered notifications for that user.
  3. The RDSH queue re-transmits the undelivered notifications, while also transmitting real-time notifications.
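The retention-and-replay behaviour in the steps above can be sketched as follows. This is an illustrative model, not the RDSH API: class and method names are invented, and delivery to currently-connected subscribers is omitted to focus on the one-week retention window.

```python
from datetime import datetime, timedelta

# Undelivered notifications are kept for at most one week (per the use case).
RETENTION = timedelta(days=7)

class NotificationQueue:
    """Toy model of a queue that retains undelivered notifications
    per subscriber and replays them on reconnect."""

    def __init__(self):
        self.pending = {}  # subscriber -> list of (timestamp, message)

    def publish(self, message, subscribers, now):
        # Record the message as pending for every subscriber of the queue.
        for sub in subscribers:
            self.pending.setdefault(sub, []).append((now, message))

    def reconnect(self, subscriber, now):
        """Return undelivered messages newer than the retention window;
        older messages are silently dropped."""
        return [msg for (ts, msg) in self.pending.pop(subscriber, [])
                if now - ts <= RETENTION]
```

For example, a message published six days before Mariam reconnects is replayed, while one published more than seven days earlier is dropped.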

Success Conditions

The user receives undelivered notifications from the subscribed queue(s).

Dependent Use Cases

UCS-002: Automatic Weather Station quality control

Title 

UCS-002: Automatic Weather Station quality control

Description 

The user creates a script that will process incoming data from the automatic weather station queue.
This script will search for values that exceed 3 times the Standard Deviation of the mean value for registered observations.
If such deviations are detected, an email is sent to Mouktar, prompting him to double-check the incoming data.
Otherwise, the data are placed on the outgoing dissemination queue.
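The three-standard-deviation check described above can be sketched in Python (the language the use case itself calls for). This is a minimal sketch, not the actual script; the function name and the in-memory lists standing in for the queue are illustrative.

```python
from statistics import mean, stdev

def quality_control(history, incoming):
    """Split `incoming` observations into (accepted, flagged) lists.

    A value is flagged as suspect when it deviates from the mean of the
    registered observations (`history`) by more than three standard
    deviations; accepted values would go to the dissemination queue,
    flagged ones trigger an email to the operator.
    """
    mu, sigma = mean(history), stdev(history)
    accepted, flagged = [], []
    for value in incoming:
        (flagged if abs(value - mu) > 3 * sigma else accepted).append(value)
    return accepted, flagged
```

Note that `stdev` here is the sample standard deviation and requires at least two historical observations; a real script would also need a policy for the cold-start case.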

Triggers 

Data arrives via the Automatic Weather Station queue.

Actors 

Local Web Application (System).

Preconditions 

  • UCS-016: The automatic weather station queue exists.
  • UCS-013: The outgoing dissemination queue exists.
  • The quality control script exists.

Steps of Execution

  1. The weather observation data arrives at Mouktar's local web application via the automatic weather station queue.
  2. The quality control script is executed on the weather observation data.
  3. The weather observation data that are within the accepted range are placed on the outgoing dissemination queue.
  4. The weather observation data that exceed the allowed deviation are temporarily withheld and a notification email is dispatched to Mouktar.

Success Conditions

Correct weather observation data are properly disseminated and incorrect weather observation data are validated by Mouktar.

Dependent Use Cases

None.

Installation Guide

Hi @solson-nws,

At a previous scrum, @NMichas had suggested that we create an installation guide for the system's applications. The suggestion was received well, and I can propose two different ways to go about it.

  1. Create an installation guide for the system in its current state: That would include instructions for the download and configuration of necessary software, deployment instructions, etc.

  2. Invest effort in packaging the system more efficiently: This will include dockerization of the required software, automated (to the highest extent possible) deployment of the applications, etc.

My suggestion would be to take option 2. It is not certain that we will be able to provide a "one-line installation" - since there may be some configuration settings which require manual intervention - but we will be very close to it.

Please let me know what you think about it.

UCS-018: Regional Data Sharing Hub reliability & high availability

Title 

UCS-018: Regional Data Sharing Hub reliability & high availability

Description 

The Regional Data Sharing Hub infrastructure provides the required components that guarantee reliable message dissemination and high availability. The system provides multiple dockerized Messaging Server instances behind a dockerized load balancer.

Triggers 

  • The RDSH receives data for dissemination.

Actors 

RDSH.

Preconditions 

Steps of Execution

  1. The data for dissemination is placed on the Messaging Server queue.
  2. The data is transmitted to all active subscribers.
  3. The data is stored so that it will be re-transmitted to inactive subscribers.
  4. The load balancer ensures that load is distributed evenly on the available Messaging Server instances.
  5. Taking a Messaging Server instance offline does not cripple the RDSH, since the rest of the instances handle the load.

Success Conditions

The message dissemination takes place without problems during high loads and Messaging Server instance failures.

Dependent Use Cases

  • UCS-013: Dataset queue creation
  • UCS-014: Dataset queue monitoring

Structured data for Google search

We have attempted to have our JSON-LD taken into account by Google search, but we have not managed to do so, even though in theory Google "should" use it.
There are no errors, and the tools provided by Google show that they do indeed read the JSON-LD.

The current situation is that we appear in Google's results if we search for text contained in the web-page, but if we include any keywords present in the JSON-LD then Google simply cannot find us.

Jeremy had mentioned that he has some connections within Google; if we had the chance, it would be good to consult with them and find a solution to the problem.

CBS TECO demo - video script

Hi and welcome to OpenWIS 2.0 proof of concept. Let's walk you through some work we have been doing over the last couple of months.


So, here is the current OpenWIS we all know and love.
People are searching for weather products all over the world. Data producers are producing data, and GISCs harvest and index data.
but... real-time access to data is limited.


So we asked... how can we make it better?


We'll tell you a story... but let's first introduce our heroes!
Here's Mouktar who runs a small server in a remote location.
He's not a computer expert, so he installs a Local Data Sharing Hub, using a fully automated procedure that only takes a few minutes.


Say hello to Omar.
Omar is pretty good with computers and wants to distribute Mouktar's data.
Omar installs a Regional Data Sharing Hub using a fully automated procedure that only takes a few minutes.


Hey Vicky!
Vicky works for a large association and wants to provide a comprehensive, curated list of weather-related products and data.
Vicky installs an AWISC - the Authoritative WIS Catalogue.


Here's Mohamed and Dave.
Mohamed is working outdoors and wants to google today's weather conditions.
Dave is a seasoned meteorologist.
Ohh, it seems we forgot Mariam! Mariam is working on a weather prediction model for which she needs real-time access to Mouktar's data.


So, these are our heroes.
Mouktar, running the Local Data Sharing Hub.
Omar, running the Regional Data Sharing hub.
Vicky, running the Authoritative WIS catalogue.
Mohamed and Dave searching for information
and Mariam wanting real-time access to data.


So here's how the system works:
As soon as Mouktar pushes data to his local system, it is automatically transmitted in real-time to Omar's system.
Vicky's system is indexing Mouktar's data too, to update its local list of curated entries.
Google is also indexing Mouktar's data.


When Mohamed is looking for data, he can use Google like he is used to.
Dave wants an expert search interface, one that Google can't provide, so he is using AWISC.
Mariam can subscribe to the RDSH to get notifications as soon as new data is available.

UCS-007: Observation dataset queue discovery and subscription using Google search

Title 

UCS-007: Observation dataset queue discovery and subscription using Google search

Description 

The user uses the Google search page to search for weather observation dataset queues in the location of interest. Among the many search results, there exist some that link to the pages created by Mouktar. When the user clicks such a link, they are redirected to Mouktar's page where they find the resource name required in order to subscribe to the queue.

Triggers 

The user wishes to subscribe to a weather observation dataset queue.

Actors 

Mariam.

Preconditions 

  • UCS-003: Mouktar's dataset pages exist

Steps of Execution

  1. The user uses the Google search page to search for weather observation dataset queues in the location of interest, and is presented with search results, including some linking to Mouktar's pages.
  2. The user clicks on a link from one of Mouktar's pages, and is redirected there.
  3. The user discovers the queue resource name within the information provided on Mouktar's page.
  4. The user enters the queue information on his client software and subscribes to notifications.

Success Conditions

The user receives notifications from the subscribed queue(s).

Dependent Use Cases

UCS-019: Maintain an Authoritative WIS Catalogue

Title 

UCS-019: Maintain an Authoritative WIS Catalogue

Description 

The user navigates to the "Authoritative Dataset Registration" page of the AWISC web application. There, he fills in a form providing the following data:

  • Dataset name
  • Dataset code
  • Dataset root URL
  • Contact (The user's contact details)
  • WMO approval (The WMO permanent representative's contact details)

Triggers 

  • The user wants to make his dataset pages accessible from AWISC.

Actors 

Mouktar.

Preconditions 

  • UCS-015: The user is authenticated.
  • UCS-003: The HTML pages that host the weather observation information are available.

Steps of Execution

  1. The user navigates to the "Authoritative Dataset Registration" page of the AWISC web application.
  2. The user provides all necessary information in the displayed form, and submits it.
  3. The system crawls the dataset root page and scans for dataset information.
  4. The system displays the found results for the user to see.

Success Conditions

The user registers his authoritative dataset and visually verifies that AWISC crawled his pages successfully.

Dependent Use Cases

  • UCS-020: Dataset page crawling and indexing
  • UCS-021: AWISC dataset HTML search page
  • UCS-022: AWISC dataset search Web-API
  • UCS-023: Search AWISC through Google

CBS TECO - demo

@6a6d74 (JT) to share the TECO agenda. In particular, we want to know the official date/time of the demo session.

UCS-001: Upload new datasets

Title 

UCS-001: Upload new datasets

Description 

The user obtains weather observations from multiple locations.
He assembles a CSV file per new observation, per location.
He uploads the CSV file by FTP in order to make the data available.
The uploaded data is processed by the system and entered into a database.

Triggers 

The user obtains weather observations.

Actors 

Mouktar.

Preconditions 

  • UCS-015: The user is authenticated.
  • UCS-003: The HTML pages that host the weather observation information are available.
  • The user has access to a properly configured FTP client.
  • The user has obtained weather observations.
  • The user has assembled the CSV files that will be uploaded.

Steps of Execution

  1. Each CSV file is uploaded to the FTP server.

Success Conditions

The weather observation data are inserted into the database and are thus available for dissemination.

Dependent Use Cases

None.

UCS-021: Observation dataset queue discovery and subscription using the AWISC search page

Title 

UCS-021: Observation dataset queue discovery and subscription using the AWISC search page

Description 

The user visits the AWISC search page. There they perform another search for observation dataset queues in the location of interest. Mouktar's pages are the only results as only he provides authoritative records for this location. When the user clicks the result link, they are redirected to Mouktar's page where they find the resource name required in order to subscribe to the queue.

Triggers 

The user wishes to subscribe to a weather observation dataset queue.

Actors 

Mariam.

Preconditions 

  • UCS-003: Mouktar's dataset pages exist
  • UCS-020: The dataset pages are already indexed by AWISC

Steps of Execution

  1. The user uses the AWISC search page to search for weather observation dataset queues in the location of interest, and is presented with search results including only links to Mouktar's pages.
  2. The user clicks on a link from one of Mouktar's pages, and is redirected there.
  3. The user discovers the queue resource name within the information provided on Mouktar's page.
  4. The user enters the queue information on his client software and subscribes to notifications.

Success Conditions

The user receives notifications from the subscribed queue(s).

Dependent Use Cases

CBS TECO demo - Decide how to get feedback on PoC

Before the TECO demo, decide how we want to invite/get feedback on the PoC demo. Do we want to make the PoC software available? To registered users only? How do we handle that registration process? Would that mean we had to create some user docs?

UCS-003: Dataset HTML page content management

Title 

UCS-003: Dataset HTML page content management

Description 

The user navigates to the "Create HTML Content" page in order to create a new web-page for a dataset. Therein, he is presented with a form in which he enters the following information:

  • Dataset description: Name, location information, license, file format, etc.
  • Google JSON-LD : The desired structured data that will make the page searchable in Google.
  • WMO JSON-LD : Dynamic selection of WMO codes (through textual reference, instead of code name) that should be registered in order for the page to be searchable from AWISC.

Upon form submission, a new web-page is automatically created and made available to the public.
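The page-generation step described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name is invented, the JSON-LD uses the schema.org Dataset vocabulary (which Google's structured-data tooling understands), and "wmoCodes" is a hypothetical stand-in for the WMO-code properties the portal would actually emit.

```python
import json

def render_dataset_page(name, description, wmo_codes):
    """Render a public dataset page with embedded JSON-LD from the
    submitted form fields (sketch only)."""
    json_ld = {
        "@context": "https://schema.org",
        "@type": "Dataset",
        "name": name,
        "description": description,
        "wmoCodes": wmo_codes,  # hypothetical extension property
    }
    # Embed the structured data in a <script> block so that crawlers
    # (Google, AWISC) can pick it up alongside the visible content.
    return (
        "<html><head>\n"
        '<script type="application/ld+json">\n'
        f"{json.dumps(json_ld, indent=2)}\n"
        "</script>\n"
        f"</head><body><h1>{name}</h1><p>{description}</p></body></html>"
    )
```

A real implementation would also escape the user-supplied fields before interpolating them into HTML.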

Triggers 

The user wishes to upload/disseminate new weather observations.

Actors 

Mouktar.

Preconditions 

  • UCS-015: The user is authenticated.

Steps of Execution

  1. The user navigates to the "Create HTML Content" page of the local web application, and is presented with a form.
  2. The user fills the required fields with dataset information, as well as JSON-LD structured data, then submits it.
  3. The system automatically generates an HTML page containing the provided information and makes it available to the public.

Success Conditions

The new dataset HTML page is made available to the public.

Dependent Use Cases

  • UCS-004
