https://w3id.org/BCI-ontology#
Mappings to SOSA/SSN: Initially, the BCI ontology was developed by following closely its alignment to the [oldSSN] spec. By mid-July 2017, when the new version of SSN released by W3C ([SSN]) reached Candidate Recommendation status (11 July 2017), the core concepts were "remapped" to the new SOSA/SSN definitions, based on the SSNX Alignment Module of the Vertical Segmentation (section 6.2) of the W3C Recommendation (19 October 2017):

| bci | Initial mapping: oldssn | New mapping: SOSA/SSN | SSNX & BCI-O Remarks |
|---|---|---|---|
| **Classes** | | | |
| bci:Aspect | oldssn:FeatureOfInterest | sosa:FeatureOfInterest | (distinction between observation and actuation targets) |
| bci:Modality | oldssn:Property | sosa:ObservableProperty | (oldssn:Property ssn:Property); (sosa:ObservableProperty ssn:Property) |
| bci:StimulusEvent | oldssn:Stimulus | ssn:Stimulus | |
| bci:Device | oldssn:SensingDevice | sosa:Sensor | (sosa:Sensor oldssn:Sensor); (oldssn:SensingDevice oldssn:Sensor) |
| bci:Record | oldssn:Observation | sosa:Observation | oldssn:Observation: combination of oldssn axiomatic statements. |
| bci:RecordedData | oldssn:SensorOutput | sosa:Result | (oldssn:SensorOutput sosa:Result) and (combination of oldssn axiomatic statements); (alignment to sosa:Result) |
| bci:DataBlock | oldssn:ObservationValue | sosa:Result | (oldssn:ObservationValue sosa:Result) and (combination of oldssn axiomatic statements); (alignment removed) |
| bci:Channel | oldssn:MeasurementCapability | ssn-system:SystemCapability | (oldssn:MeasurementCapability ssn-system:SystemCapability) and (combination of oldssn axiomatic statements) |
| bci:NonChannel | oldssn:MeasurementCapability | ssn-system:SystemCapability | (oldssn:MeasurementCapability ssn-system:SystemCapability) and (combination of oldssn axiomatic statements) |
| bci:SamplingRate | oldssn:Frequency | ssn-system:Frequency | (oldssn:Frequency ssn-system:Frequency) |
| bci:DeviceSpec | oldssn:SensorDataSheet | | unchanged |
| bci:Actuation | | sosa:Actuation | new |
| bci:Actuator | | sosa:Actuator | new |
| bci:ImpactedProperty | | sosa:ActuatableProperty | new |
| bci:ActuationResult | | sosa:Result | new |
| bci:ActuationTarget | | sosa:FeatureOfInterest | new |
| **Object Properties** | | | |
| bci:hasModality | oldssn:hasProperty | ssn:hasProperty | |
| bci:isModalityOf | oldssn:isPropertyOf | ssn:isPropertyOf | |
| bci:aspectOfInterest | oldssn:featureOfInterest | sosa:hasFeatureOfInterest | |
| bci:madeRecord | oldssn:madeObservation | sosa:madeObservation | |
| bci:observedByDevice | oldssn:observedBy | sosa:madeBySensor | |
| bci:observedModality | oldssn:observedProperty | sosa:observedProperty | |
| bci:detects | oldssn:detects | ssn:detects | |
| bci:isProxyFor | oldssn:isProxyFor | ssn:isProxyFor | |
| bci:hasValue | oldssn:hasValue | sosa:hasResult | (deprecated to simplify the model) |
| bci:isProducedByDevice | oldssn:isProducedBy | | not defined in SOSA/SSN; deprecated. Defined as the following role inclusion axiom: (bci:isObservationResultOf sosa:isResultOf) * (bci:observedByDevice sosa:madeBySensor) ⊆ (bci:isProducedByDevice) |
| bci:observationResult | oldssn:observationResult | sosa:hasResult | |
| bci:forModality | oldssn:forProperty | ssn:forProperty | |
| bci:hasNonChannelData | oldssn:hasMeasurementCapability | ssn-system:hasSystemCapability | (oldssn:hasMeasurementCapability ssn-system:hasSystemCapability) |
| bci:observes | oldssn:observes | sosa:observes | in combination with the property-chain axioms |
| bci:ofAspect | oldssn:ofFeature | | not defined in SOSA/SSN; deprecated. |
| bci:includesEvent | DUL:includesEvent | | not used in SOSA/SSN; deprecated. A new property has been defined: ssn:wasOriginatedBy |

Symbolic notation (used in the original mapping column): (*) aligned to (subclass or sub-property of). (*) equivalent concepts. (*) the new mapping implies a conceptual update.

As of January 2018, since all alignments had been updated to the SOSA/SSN core classes, the new mappings are now based on the Dolce-Ultralite Alignment Module of the Vertical Segmentation (section 6.1) of [SSN]. Thus, the BCI ontology imports the ssn-dul definitions. As one of the ontologies ("concept producers", section 3 "Usage in ontologies (Producers)") that reuse SSN, BCI-O was selected as part of the analysis in the OGC & W3C document on the usage of the SSN ontology.
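The role inclusion axiom given above for bci:isProducedByDevice corresponds to an OWL 2 property chain. A minimal Turtle sketch of how such a chain can be declared (an illustration of the axiom as stated, not an excerpt from the ontology file):

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix bci: <https://w3id.org/BCI-ontology#> .

# (bci:isObservationResultOf) o (bci:observedByDevice) ⊆ bci:isProducedByDevice:
# recorded data that is the observation result of a record observed by a device
# is thereby produced by that device.
bci:isProducedByDevice a owl:ObjectProperty ;
    owl:propertyChainAxiom ( bci:isObservationResultOf bci:observedByDevice ) .
```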
(*) [Compton2009] Compton, M.; Neuhaus, H.; Taylor, K. and Tran, K. Reasoning about Sensors and Compositions. In Proceedings of the 2nd International Workshop on Semantic Sensor Networks (SSN09) at ISWC 2009, pp. 33-48, 2009. URL=http://ceur-ws.org/Vol-522/p7.pdf.
(*) [ESS] SCCN, "EEG Study Schema (ESS)". Resources: [ESS@SCCN], [ESS v2.0].
(*) [HED] N. Bigdely-Shamlo, K. Kreutz-Delgado, M. Miyakoshi, M. Westerfield, T. Bel-Bahar, C. Kothe and K. Robbins, "Hierarchical event descriptor (HED) tags for analysis of event-related EEG studies". Resources: [HED@SCCN], [HED v2.0].
(*) [OWL-Time] OGC & W3C, "Time Ontology in OWL (W3C Recommendation 19 October 2017)", https://www.w3.org/TR/owl-time/.
(*) [Seydoux2016] Seydoux, N.; Drira, K.; Hernandez, N.; Monteil, T. IoT-O, a Core-Domain IoT Ontology to Represent Connected Devices Networks. In Proceedings of the 20th International Conference on Knowledge Engineering and Knowledge Management, Volume 10024 (EKAW 2016), Bologna, Italy, 2016, pp. 561-576. DOI=http://dx.doi.org/10.1007/978-3-319-49004-5_36. URL=https://dl.acm.org/citation.cfm?id=3092997. See also:
    (*) [SAN] The "Semantic Actuator Network (SAN)" ontology. http://lov.okfn.org/dataset/lov/vocabs/SAN.
    (*) [AAE] The "Actuation-Actuator-Effect (AAE)" design pattern. http://ontologydesignpatterns.org/wiki/Submissions:Actuation-Actuator-Effect.
(*) [Shafer2001] Shafer, S. A. N.; Brumitt, B.; Cadiz, J. J. Interaction Issues in Context-aware Intelligent Environments. Human-Computer Interaction, Volume 16, Issue 2 (December 2001), pp. 363-378. DOI=http://dx.doi.org/10.1207/S15327051HCI16234_16. URL=http://dl.acm.org/citation.cfm?id=1463124.
(*) [SSN] OGC & W3C, "Semantic Sensor Network (W3C Recommendation 19 October 2017)", https://www.w3.org/TR/vocab-ssn/. See also: W3C Spatial Data on the Web Working Group, https://www.w3.org/2015/spatial/wiki/Main_Page.
(*) [oldSSN] W3C, "Semantic Sensor Network (SSN) Ontology", http://www.w3.org/2005/Incubator/ssn/ssnx/ssn.html. See also: W3C Semantic Sensor Network Incubator Group, "Semantic Sensor Network XG Final Report", http://www.w3.org/2005/Incubator/ssn/XGR-ssn-20110628/.
(*) [Unity] Unity Gaming Platform (http://unity3d.com/), Unity's Gaming Modeling Architecture Manual, http://docs.unity3d.com/Manual/index.html.
(*) [XDF] C. Kothe and C. Brunner, "XDF (Extensible Data Format)", https://code.google.com/p/xdf/.
Actuation: Automated Wheelchair: The following use case presents an example that depicts how to define the related BCI-O actuation concepts. Wheelchair driving scenario: (*) Purpose: use an actuator capable of controlling a wheelchair based on the input from a BCI/EEG record (obtained directly from the subject's head). (*) Description: Alice is driving a wheelchair through a human interface composed of three major components: (*) An EEG sensor capable of reading brain signals. (*) A computing system capable of processing and analyzing (classifying) the brain signals collected from the EEG sensor. (*) An actuator capable of controlling the wheelchair's movement (such as direction and acceleration) based on the input from (2). The actuator is a device that works in the following way: (*) The processed brain signals issue specific movement commands to the actuator, such as: (*) Command: "slow down" with: (*) Direction: go forward (no change in the direction). (*) Acceleration: -10.5 cm/s2 (change in the speed). (*) The actuator mechanism: (*) Implements the procedure (actuation) to control the wheelchair. (*) It triggers a series of steps aimed at changing the wheelchair's state: to decelerate its wheels.
The modeled BCI-O concepts involved in this scenario, excluding those from the observation component (except for EEG-Record and EEG-Device), are listed below: (*) Subject (x1): "Alice". (*) Session (x1): "a situation where the observation (EEG recording) and the actuation happened". (*) Activity (x1): "controlling the automated wheelchair". (*) Context (x1): "at home". (*) Context.Scene (x1): "specific indoors situation". (*) EegRecord (x1): "observation of the EEG record that serves as the input of the actuations". (*) EegDevice (x1): "EEG device that made the EEG recordings". (*) Command (x1): "slow down: (EEG record) -- actuators". (*) Actuator (x2): "the devices that perform the actuations". (*) Actuation (x2): "the procedures that change the state of the wheels via actuators". (*) ImpactedProperty (x2): "the speed of the wheels (their state)". (*) ActuationEvent (x2): "reduce the speed of the wheels". (*) ActuationResult (x2): "slowing down" ("the effect of decelerating the wheels"). (*) ActuationTarget (x2): "rear wheels". (*) Context.Object (composite) (x1): "wheelchair". (*) Context.Method (x1): "deceleration of a wheel". <DCMIType-StillImage>05-UserCase-Actuation.jpg</DCMIType-StillImage> An RDF file containing a graph corresponding to this example is available.
<div class="example"><div class="example-title marker">Actuation: Automated Wheelchair</div>
<pre class="hljs xml">
@prefix : <http://example.org/data/> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix schema: <http://schema.org/> .
@prefix time: <http://www.w3.org/2006/time#> .
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix ssn: <http://www.w3.org/ns/ssn/> .
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix qudt-1-1: <http://qudt.org/1.1/schema/qudt#> .
@prefix qudt-unit-1-1: <http://qudt.org/1.1/vocab/unit#> .
@base <http://example.org/data/> .

# Alice performs a "controlling-wheelchair" activity "at home" (scene labeled as: "indoors-X3" #4).
# The context scene indoors-X3 #4 has the wheelchair as part of its structural composition.
Alice rdf:type bci:Subject .
controlling-wheelchair rdf:type bci:Activity .
context/at-home/6 rdf:type bci:Context ;
    bci:hasScene scene/indoors-X3/4 .
scene/indoors-X3/4 rdf:type bci:Context.Scene ;
    bci:hasObject wheelchair .

# The wheelchair has a left-rear-wheel (#47) and a right-rear-wheel (#39), which are the actuation targets.
# All of them are context objects.
wheelchair rdf:type bci:Context.Object ;
    bci:hasObject left-rear-wheel/47 ;  # bci:ActuationTarget defined below
    bci:hasObject right-rear-wheel/39 . # bci:ActuationTarget defined below

# The session is titled "an actuation example".
# The session observes an EEG-Record and covers 2 actuations (one for each rear wheel).
session/91 rdf:type bci:Session ;
    bci:hasTitle "an actuation example" ;
    bci:isSessionOf Alice, context/at-home/6 ;
    bci:hasActivity controlling-wheelchair ;
    bci:hasRecord EegRecord/46 ;
    bci:hasActuation actuation/62, actuation/63 .

# Alice's EEG-Record is observed by EEG-device ctnp-A128 #5.
# This is the input for the command to "slow down" #11 that initiates the execution of the actuators.
# The details about the recording's data and results are not shown in this graph.
EegRecord/46 rdf:type bci:EegRecord ;
    bci:observedByDevice ctnp-A128/5 ;
    bci:isInputFor cmd/slowDown/11 .

# EegDevice ctnp-A128 #5 observes the EEG recordings of Alice.
ctnp-A128/5 rdf:type bci:EegDevice ;
    bci:madeRecord EegRecord/46 .

# The SlowDown command defines two associated values:
# - direction: in this scenario, its value is "forward" (implies no change in this state's axis).
# - acceleration: in this scenario, its value is -10.5 cm/s2 (implies a change in this state's axis).
# The SlowDown command #11 gives the actuators servo4WC-ABC #1 and #2 their entry point for execution,
# based on the input received from the EEG recordings #46.
:SlowDown rdfs:subClassOf bci:Command .
:direction a owl:ObjectProperty ;
    schema:domainIncludes :SlowDown ;
    schema:rangeIncludes rdfs:Literal .
:acceleration a owl:ObjectProperty ;
    schema:domainIncludes :SlowDown ;        # bci:Command
    schema:domainIncludes :slowingDown ;     # bci:ActuationResult
    schema:domainIncludes :accelerateWheel ; # bci:Context.Method
    schema:rangeIncludes qudt-1-1:QuantityValue .
_:acceleration-value rdf:type qudt-1-1:QuantityValue ;
    qudt-1-1:numericValue "-10.5"^^xsd:double ; # deceleration
    qudt-1-1:unit qudt-unit-1-1:CentimeterPerSecondSquared .
cmd/slowDown/11 rdf:type :SlowDown ;
    :direction "forward" ;
    :acceleration _:acceleration-value ;
    bci:consumesInputFrom EegRecord/46 ;
    bci:isExecutedBy servo4WC-ABC/1, servo4WC-ABC/2 .

# servo4WC-ABC #1 made actuation #62, and servo4WC-ABC #2 made actuation #63:
# both execute the command to "slow down" #11.
# The model says that:
# - servo4WC-ABC/1 is designed to automatically change the speed of the right rear wheel.
# - servo4WC-ABC/2 is designed to automatically change the speed of the left rear wheel.
# Each actuator triggers an event to reduce the speed of the wheel it is bound to.
servo4WC-ABC/1 rdf:type bci:Actuator ;
    sosa:madeActuation actuation/62 ;
    ssn:forProperty right-rear-wheel/39#speed ;
    bci:triggers ae/reduceSpeed/81 .
servo4WC-ABC/2 rdf:type bci:Actuator ;
    sosa:madeActuation actuation/63 ;
    ssn:forProperty left-rear-wheel/47#speed ;
    bci:triggers ae/reduceSpeed/82 .

# The rear wheels are the actuation targets (one for each corresponding actuation procedure).
# Each wheel's speed state is an ImpactedProperty.
# The model allows us to state explicitly that:
# - left-rear-wheel/47#speed is a property of left-rear-wheel/47
# - right-rear-wheel/39#speed is a property of right-rear-wheel/39
left-rear-wheel/47 rdf:type bci:ActuationTarget ;
    ssn:hasProperty left-rear-wheel/47#speed .
right-rear-wheel/39 rdf:type bci:ActuationTarget ;
    ssn:hasProperty right-rear-wheel/39#speed .
left-rear-wheel/47#speed rdf:type bci:ImpactedProperty ;
    sosa:isActedOnBy actuation/63 .
right-rear-wheel/39#speed rdf:type bci:ImpactedProperty ;
    sosa:isActedOnBy actuation/62 .

# Actuation #62 acted on the state (speed) of right-rear-wheel #39,
# and returned "slowing down" #788 as its associated result.
# Actuation #63 acted on the state (speed) of left-rear-wheel #47,
# and returned "slowing down" #789 as its associated result.
# Each actuation has a timestamp for its associated result.
actuation/62 rdf:type bci:Actuation ;
    sosa:actsOnProperty right-rear-wheel/39#speed ;
    sosa:actuationMadeBy servo4WC-ABC/1 ;
    sosa:hasResult ar/slowingDown/788 ;
    sosa:phenomenonTime _:actuation-time ;
    sosa:resultTime "2018-05-06T20:05:13+00:00"^^xsd:dateTimeStamp .
actuation/63 rdf:type bci:Actuation ;
    sosa:actsOnProperty left-rear-wheel/47#speed ;
    sosa:actuationMadeBy servo4WC-ABC/2 ;
    sosa:hasResult ar/slowingDown/789 ;
    sosa:phenomenonTime _:actuation-time ;
    sosa:resultTime "2018-05-06T20:05:13+00:00"^^xsd:dateTimeStamp .

# The time interval of the actuations:
_:actuation-time rdf:type time:Interval ;
    time:hasBeginning [ rdf:type time:Instant ;
        time:inXSDDateTimeStamp "2018-05-06T20:05:12+00:00"^^xsd:dateTimeStamp ] ;
    time:hasEnd [ rdf:type time:Instant ;
        time:inXSDDateTimeStamp "2018-05-06T20:05:13+00:00"^^xsd:dateTimeStamp ] .

# The slowingDown actuation result defines a value for the acceleration (change of speed): see above...
# // :acceleration schema:domainIncludes :slowingDown ;
# // :slowingDown rdfs:subClassOf bci:ActuationResult .

# The actuation results "slowing down" #788 and #789 are defined in the following way:
# * Both "slowing down" #788 and #789 are the actuation results of actuations #62 and #63 respectively.
# * They have the associated values:
#   - change of speed: the deceleration of 10.5 cm/s2.
# * Both "slowing down" #788 and #789 involve the events "reduce speed" #81 and #82 respectively.
ar/slowingDown/788 rdf:type :slowingDown ;
    :acceleration _:acceleration-value ;
    bci:involves ae/reduceSpeed/81 .
ar/slowingDown/789 rdf:type :slowingDown ;
    :acceleration _:acceleration-value ;
    bci:involves ae/reduceSpeed/82 .

# The actuation events (#81 and #82) change the state of the actuation targets (the rear wheels),
# through the "accelerateWheel" (context method) deceleration effectuation, which handles the acceleration value.
# The "accelerateWheel" context method defines a "parameter" value: see above...
# // :acceleration schema:domainIncludes :accelerateWheel ;
# // :accelerateWheel rdfs:subClassOf bci:Context.Method .
deacceleration rdf:type :accelerateWheel ;
    :acceleration _:acceleration-value .
ae/reduceSpeed/81 rdf:type bci:ActuationEvent ;
    bci:effectuates deacceleration ;
    bci:changes right-rear-wheel/39 .
ae/reduceSpeed/82 rdf:type bci:ActuationEvent ;
    bci:effectuates deacceleration ;
    bci:changes left-rear-wheel/47 .
</pre></div>
2014-03-27
Regarding Aspect and Modality: The importance of, and the relationship between, the concepts sosa:FeatureOfInterest (superclass of Aspect) and sosa:ObservableProperty (superclass of Modality) are shown and explained in [oldSSN]: (*) (5.4.4 Agriculture Meteorology Sensor Network -- 5.4.4.1.1 Sensor selection), (*) (5.4.4 Agriculture Meteorology Sensor Network -- 5.4.4.2.2 CF (Climate and Forecast) ontologies), (*) (5.4.3 Wind sensor (WM30) -- 5.4.3.6 Wind Feature and properties), (*) (Wind Sensor example -- Wind Sensor example: Feature of interest). As an SSN domain application ontology, BCI-O defines and adjusts these concepts appropriately for describing the nature of observations of BCI activities.
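This Aspect/Modality pairing can be illustrated with a minimal Turtle sketch (the instance names are hypothetical, chosen only for illustration; the bci: properties are those listed in the mapping table):

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/data/> .

# The feature of interest (Aspect) carries the observable property (Modality):
ex:motor-imagery a bci:Aspect ;       # sosa:FeatureOfInterest
    bci:hasModality ex:mu-rhythm .    # ssn:hasProperty
ex:mu-rhythm a bci:Modality ;         # sosa:ObservableProperty
    bci:isModalityOf ex:motor-imagery .
```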
Regarding EEG concepts: This ontology defines the following EEG concepts: (*) EegModality. (*) EegNonChannel. (*) EegDevice. (*) EegRecord. (*) EegChannel. If necessary, BCI applications may define a set of restrictions and specialized connections (subproperties) for the relations among the EEG concepts.
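For instance, an application could specialize the generic relations for the EEG case as follows (a sketch only; the ex: names are hypothetical and not part of this ontology):

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix bci:  <https://w3id.org/BCI-ontology#> .
@prefix ex:   <http://example.org/app#> .

# A specialized connection: an EEG record is observed specifically by an EEG device.
ex:eegObservedByDevice a owl:ObjectProperty ;
    rdfs:subPropertyOf bci:observedByDevice ;
    rdfs:domain bci:EegRecord ;
    rdfs:range  bci:EegDevice .

# A restriction: every EEG record in the application is observed by some EEG device.
ex:AppEegRecord rdfs:subClassOf bci:EegRecord,
    [ a owl:Restriction ;
      owl:onProperty ex:eegObservedByDevice ;
      owl:someValuesFrom bci:EegDevice ] .
```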
http://www.essepuntato.it/lode/owlapi/https://w3id.org/BCI-ontology# (visualise the BCI-ontology with LODE)
The BCI ontology specifies a foundational metadata model set for real-world multimodal Brain-Computer Interaction (BCI) data-capture activities. Its structure depicts a conceptual framework that BCI applications can extend and use in their implementations to define core concepts that capture a relevant and interoperable metadata vocabulary. This ontology is aligned to the Semantic Sensor Network ontology (SSN): a domain-independent, end-to-end model for sensor/actuator applications. Hence, its structure has been normalized so that it can be used in conjunction with other ontologies or linked-data resources to specify any particular definitions (such as units of measurement, time and time series, and location and mobility) that specialized applications in the BCI domain might need. This spec also provides general alignment and data-modeling guidelines for the core concepts, to help BCI applications in their design.
Regarding the treatment of measurement units: This ontology leaves it open to BCI applications how to handle the level of semantic expressiveness of measurement units. In general, there are two approaches (based on their data requirements): (*) For data-centric applications: using data type properties (with previously-known units of measurement) implies that it is not necessary to incorporate into the ontology's definition a semantic structure to properly describe units of measurement. (*) For semantic-centric applications: it is necessary to incorporate into the ontology's definition a structure that properly describes units of measurement, depending on the required level of semantic expressiveness. The vast majority of BCI applications are heavily data-centric. Well-known measurement units for a wide range of metadata attributes are used in different specs (such as [XDF] and [ESS]), e.g., pixels, mm, degrees and microvolts. For them, defining a relevant data type property set without specifying measurement units suffices. For BCI applications that require a proper level of semantic expressiveness for measurement units, this ontology provides the following guidelines: (*) The BCI concepts that are subject to extension are those related to Device and Record (including the channeling spec definitions). (*) From the perspective of the BCI ontology's alignment to the Stimulus-Sensor-Observation ontology design pattern, the core SSN concepts that "map" to quantities (and, therefore, to units of measurement) are ssn-system:SystemCapability and ssn-system:SystemProperty. Thus, BCI applications should pay special attention to extending the semantic structure of the concepts: (*) Channel, and (*) NonChannel.
(*) BCI applications should extend the BCI ontology based on the [oldSSN] guidelines, explained in: (*) (5.3.10 Data -- 5.3.10.2 How to attach a data value to a property?), (*) (5.3.13 Energy -- 5.3.13.3 How to represent a WSN node with information about its energy consumption), (*) (5.4.2 Smart product example -- 5.4.2.2 Sensor), (*) (5.4.2 Smart product example -- 5.4.2.3 Measurement capabilities), (*) (5.4.2 Smart product example -- 5.4.2.4 Observation), (*) (5.4.4 Agriculture Meteorology Sensor Network -- 5.4.4.1.3 Sensor view), (*) (5.4.4 Agriculture Meteorology Sensor Network -- 5.4.4.2.3 Units of measurement and quantity ontologies). (*) It is recommended to incorporate the semantic extensions through their alignment to a proper ontology for units of measurement. Recommended ontologies for units of measurement are: (*) QUDT - Quantities, Units, Dimensions and Data Types Ontologies, developed by the NExIOM project (NASA, TopQuadrant). (*) Ontology of Units of Measure (OM): om-1.8.2, developed by a team of researchers at Wageningen UR (wurvoc.org).
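Following that guideline, a sketch of how an application might attach a QUDT quantity value to a channel-related capability (the ex: property and instance names are hypothetical; the qudt-1-1: prefixes follow the examples used elsewhere in this spec):

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix schema: <http://schema.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix qudt-1-1: <http://qudt.org/1.1/schema/qudt#> .
@prefix qudt-unit-1-1: <http://qudt.org/1.1/vocab/unit#> .
@prefix ex: <http://example.org/app#> .

# A sampling rate expressed as a QUDT quantity value instead of a bare literal:
ex:hasSamplingRateValue a owl:ObjectProperty ;
    schema:domainIncludes bci:SamplingRate ;
    schema:rangeIncludes qudt-1-1:QuantityValue .
ex:sampling-rate-512 a bci:SamplingRate ;
    ex:hasSamplingRateValue [ a qudt-1-1:QuantityValue ;
        qudt-1-1:numericValue "512.0"^^xsd:double ;
        qudt-1-1:unit qudt-unit-1-1:Hertz ] .
```

The same pattern applies to other Channel and NonChannel capabilities that carry units.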
Observation Context: Visually Evoked Potential (VEP) Virtual Environment Navigation for Glaucoma Patients: The following use case presents an example that depicts how to define the related BCI-O observation and context concepts. (*) Purpose: to present flickering visual stimuli on a subset of objects in a virtual environment and to determine whether the subject suffers vision loss at a particular region, based on the EEG responses that he has to those stimuli. (*) Description: Bob is navigating towards a pillar (target) located at the center of the field whilst avoiding obstacles randomly placed on the path in-between. A subject suffering from glaucoma may not be able to see objects in a certain sector of his peripheral vision and hence might unknowingly hit upon those unseen objects. The major components in this scenario are: (*) An EEG sensor capable of capturing brain signals using a certain electrode placement. (*) A computing system capable of processing and analyzing (classifying) the brain signals collected from the EEG sensor. (*) A gameplay recording mechanism to record the pathway that the subject took to reach the target. (*) An event generation mechanism that fires whenever the subject interacts with an object in the field or whenever a flickering stimulus is presented in his field of vision. The context (virtual environment) consists of the following objects: (*) An assortment of objects of different sizes typically found in a forest, randomly placed throughout the environment, e.g., stones, trees and pit holes. (*) A pillar (target) placed in the middle of the field, which flickers at an unnoticeably high frequency, e.g., 45 Hz. (*) A selection of objects that flicker at a high frequency lower than the target frequency, e.g., 30 Hz - 40 Hz. (*) A high flickering frequency is necessary in order to minimize uneasiness felt by the subjects.
Several events are generated whenever: (*) The subject hits upon an object (event data: location of the object). (*) A flickering object appears within the subject's field of vision (event data: flickering frequency). (*) The subject reaches the target. A 3D animation of this virtual environment is available. Experiment flow: (*) The objective is for the subject to walk towards the middle of the field without hitting any objects placed on the field. (*) Events are triggered and recorded whenever certain conditions are met. (*) The pathway that the subject took to reach the target, and the locations of the objects that he hit (if any), are recorded. Analysis mechanism: (*) The flickering objects serve as an indicator of whether the subject actually sees them within his peripheral vision field: they will induce an EEG response matching the flickering frequency if the subject actually sees them. (*) The event markers aid the analyst in doing selective epoch analysis based on the interests of the analysis, e.g., epochs in which a flickering object is within the subject's field of vision, or epochs in which he hit an object. <DCMIType-StillImage>05-UserCase-Observation-Context.jpg</DCMIType-StillImage> An RDF file containing a graph corresponding to this example is available.
<div class="example"><div class="example-title marker">Observation Context: Visually Evoked Potential (VEP) Virtual Environment Navigation for Glaucoma Patients</div>
<pre class="hljs xml">
@prefix : <http://example.org/data/> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix schema: <http://schema.org/> .
@prefix time: <http://www.w3.org/2006/time#> .
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix ssn: <http://www.w3.org/ns/ssn/> .
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix qudt-1-1: <http://qudt.org/1.1/schema/qudt#> .
@prefix qudt-unit-1-1: http://qudt.org/1.1/vocab/unit# . @base http://example.org/data/ . # The 3 main sets of definitions: # (i) Virtual Environment (Context): objects, behaviour, events (stimuli). # (ii) Subject, Actions # (iii) Session, EEG Record, EEG sensor (channeling scheme). # Bob navegates in (interacts with) a virtual environment (context), labeled as ven4gp #1. # The context scene forest #1 depicts the architectural design of the objects' layout as part of its structural composition. # The session's activity is labeled "glaucoma-tracking #1". Bob rdf:type bci:Subject . glaucoma-tracking/1 rdf:type bci:Activity . context/ven4gp/1 rdf:type bci:Context ; bci:hasScene scene/forest/1 . scene/forest/1 rdf:type bci:Context.Scene ; bci:hasObject # Objects without flickering behavior: static-stone/1, static-stone/2, static-stone/3, static-stone/4, # ... associate here all the "static stones" in the scene. static-tree/1, static-tree/2, static-tree/3, static-tree/4, # ... associate here all the "static trees" in the scene. static-pit-hole/1, static-pit-hole/2, static-pit-hole/3, # ... associate here all the "static pit-holes" in the scene. # Objects with flickering behavior: flickering-stone/1, flickering-stone/2, flickering-stone/3, # ... associate here all the "flickering stones" in the scene. flickering-tree/1, flickering-tree/2, flickering-tree/3, flickering-tree/4, # ... associate here all the "flickering trees" in the scene. flickering-pit-hole/1, flickering-pit-hole/2, flickering-pit-hole/3, # ... associate here all the "flickering pit-holes" in the scene. # Target: pillar target/pillar/1 . # Below (1~9) are the main structural and functional component definitions for the virtual environment. # (1) The LocatedObject context object: defines 3 associated values that represents the object's location (coordinates). # - X-coor: coordinate in the X axis. # - Y-coor: coordinate in the Y axis. # - Z-coor: coordinate in the Z axis. 
# This class gives the "location" notion to all the participant objects in the scene. :LocatedObject rdfs:subClassOf bci:Context.Object . :X-coor a owl:ObjectProperty ; schema:domainIncludes :LocatedObject ; schema:rangeIncludes qudt-1-1:QuantityValue . :Y-coor a owl:ObjectProperty ; schema:domainIncludes :LocatedObject ; schema:rangeIncludes qudt-1-1:QuantityValue . :Z-coor a owl:ObjectProperty ; schema:domainIncludes :LocatedObject ; schema:rangeIncludes qudt-1-1:QuantityValue . # (2) The context objects that defines a flickering (and non-flickering) behavior: # - NonFlickeringObject: doesn't have any flickering mechanism (context method). # - LowerFrequencyFlickeringObject: has a constant flickering mechanism (context method) --30Hz--. # - TargetFrequencyFlickeringObject: has a constant flickering mechanism (context method) --45Hz--. :NonFlickeringObject rdfs:subClassOf bci:Context.Object . :LowerFrequencyFlickeringObject rdfs:subClassOf bci:Context.Object . :TargetFrequencyFlickeringObject rdfs:subClassOf bci:Context.Object . # (3) The located context object types of different nature that defines the notions of stones, trees and pit holes. :Stone rdfs:subClassOf :LocatedObject . :Tree rdfs:subClassOf :LocatedObject . :Pit-hole rdfs:subClassOf :LocatedObject . # (4) The classes of static objects that don't flicker. :StaticStone rdfs:subClassOf :NonFlickeringObject, :Stone . :StaticTree rdfs:subClassOf :NonFlickeringObject, :Tree . :StaticPit-hole rdfs:subClassOf :NonFlickeringObject, :Pit-hole . # (5) The classes of flickering objects that are not the target (Pillar). :FlickeringStone rdfs:subClassOf :LowerFrequencyFlickeringObject, :Stone . :FlickeringTree rdfs:subClassOf :LowerFrequencyFlickeringObject, :Tree . :FlickeringPit-hole rdfs:subClassOf :LowerFrequencyFlickeringObject, :Pit-hole . # (6) The target (Pillar). :Pillar rdfs:subClassOf :TargetFrequencyFlickeringObject, :LocatedObject . target/pillar/1 rdf:type :Pillar . 
# (7) The flickering mechanisms (types of context methods): # - flickeringAtLowerFrequency: a constant flickering mechanism of 30 Hz. # - flickeringAtTargetFrequency: a constant flickering mechanism of 45 Hz. # Both are bound to their correspondent objects. :hasFrequency a owl:ObjectProperty ; schema:domainIncludes bci:Context.Method ; schema:rangeIncludes qudt-1-1:QuantityValue . _:lower-freq-value rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "30.0"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Hertz . _:target-freq-value rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "45.0"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Hertz . flickeringAtLowerFrequency rdf:type bci:Context.Method ; bci:definesBehaviorOf :LowerFrequencyFlickeringObject ; :hasFrequency _:lower-freq-value . # a constant value flickeringAtTargetFrequency rdf:type bci:Context.Method ; bci:definesBehaviorOf :TargetFrequencyFlickeringObject ; :hasFrequency _:target-freq-value . # a constant value # (8) The stimuli events notion associated to the flickering methods. flickeringEvent rdf:type bci:Context.Event ; bci:effectuates flickeringAtLowerFrequency, flickeringAtTargetFrequency . # (9) The subject events (actions) and capability. walk rdf:type bci:Context.Capability . Bob bci:canPerform walk . :Action.HitObject rdfs:subClassOf bci:Action . :Action.DetectFlickeringObject rdfs:subClassOf bci:Action . :Action.ReachTarget rdfs:subClassOf bci:Action . 
# Examples on how Bob issues his related actions:
# Bob bci:issues action/hit-object/344
# Bob bci:issues action/detect-flickering/96

# Below is an excerpt of the definitions for the assortment of objects that participate in the scene:
static-stone/1 rdf:type :StaticStone ;
    :X-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "6.75"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] ;
    :Y-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "2.15"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] ;
    :Z-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "0.0"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] .

static-stone/2 rdf:type :StaticStone . # + X, Y, Z coordinates
# static-stone/3 ... , static-stone/4 ...
static-tree/1 rdf:type :StaticTree . # + X, Y, Z coordinates
# static-tree/2 ... , static-tree/3 ... , static-tree/4 ...
static-pit-hole/1 rdf:type :StaticPit-hole . # + X, Y, Z coordinates
# static-pit-hole/2 ... , static-pit-hole/3 ...

flickering-stone/1 rdf:type :FlickeringStone ;
    :X-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "5.05"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] ;
    :Y-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "12.37"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] ;
    :Z-coor [ rdf:type qudt-1-1:QuantityValue ; qudt-1-1:numericValue "0.0"^^xsd:double ; qudt-1-1:unit qudt-unit-1-1:Meter ] .

flickering-stone/2 rdf:type :FlickeringStone . # + X, Y, Z coordinates
# flickering-stone/3 ...
flickering-tree/1 rdf:type :FlickeringTree . # + X, Y, Z coordinates
# flickering-tree/2, flickering-tree/3 ...
flickering-pit-hole/1 rdf:type :FlickeringPit-hole . # + X, Y, Z coordinates
# flickering-pit-hole/2 ... , flickering-pit-hole/3 ...

target/pillar/1 rdf:type :Pillar . # + X, Y, Z coordinates ; location: in the middle of the field.

# The session observes an EEG-Record with its related settings.
session/9 rdf:type bci:Session ;
    bci:hasTitle "Visually Evoked Potential (VEP) Virtual Environment Navigation for Glaucoma Patients" ;
    bci:isSessionOf Bob, context/ven4gp/1 ;
    bci:hasActivity glaucoma-tracking/1 ;
    bci:hasRecord EegRecord/18 .

# Bob's EEG-Record is observed by the EEG-device eeg-dev #3.
# The EEG observation is associated to the following metadata:
# - the data recordings (result: RecordedData).
# - the channeling schema used to collect the brain signals (RecordChannelingSpec).
# - a time interval for its duration.
# - a timestamp for its associated result.
EegRecord/18 rdf:type bci:EegRecord ;
    bci:observedByDevice eeg-dev/3 ;
    bci:hasRecordChannelingSpec channeling-spec/18 ;
    bci:observationResult data-file/2 ;
    bci:aspectOfInterest vision-field-sensitivity ;
    bci:observedModality mfSSVEP ;
    ssn:wasOriginatedBy flickeringAtLowerFrequency, flickeringAtTargetFrequency ;
    sosa:phenomenonTime _:observation-time ;
    sosa:resultTime "2018-05-28T16:42:51+00:00"^^xsd:dateTimeStamp .

# The time interval of the observation:
_:observation-time rdf:type time:Interval ;
    time:hasBeginning [ rdf:type time:Instant ; time:inXSDDateTimeStamp "2018-05-28T16:40:08+00:00"^^xsd:dateTimeStamp ] ;
    time:hasEnd [ rdf:type time:Instant ; time:inXSDDateTimeStamp "2018-05-28T16:42:48+00:00"^^xsd:dateTimeStamp ] .

# The core related descriptions about the BCI descriptive features are shown below.
# EegDevice eeg-dev #3 observes the EEG recordings of Bob.
eeg-dev/3 rdf:type bci:EegDevice ;
    bci:observes mfSSVEP ;
    bci:madeRecord EegRecord/18 .

# The data recordings data-file #2 is the result of the observation.
data-file/2 rdf:type bci:RecordedData ;
    bci:isProducedByDevice eeg-dev/3 .

# The channeling scheme spec used in the observation.
channeling-spec/18 rdf:type bci:RecordChannelingSpec ;
    bci:hasChannelData eeg-channel/1, eeg-channel/2, eeg-channel/3 .
# ... associate here all the channels set for the recording.

# The channels definitions used in the recordings.
eeg-channel/1 rdf:type bci:EegChannel . # ... insert here the channel's settings.
eeg-channel/2 rdf:type bci:EegChannel . # ... insert here the channel's settings.
eeg-channel/3 rdf:type bci:EegChannel . # ... insert here the channel's settings.
# ... for all the defined channels.

# The aspect and modality of the observation.
vision-field-sensitivity rdf:type bci:NeurologicalAspect ;
    bci:hasModality mfSSVEP .
mfSSVEP rdf:type bci:EegModality ;
    bci:hasChannelingSpec channeling-spec/18 .
01-core-structure.png
Sergio José Rodríguez Méndez. Pervasive Embedded Technologies Laboratory (PET Lab), Computer Science Department, NCTU, Taiwan. John K. Zao. Pervasive Embedded Technologies Laboratory (PET Lab), Computer Science Department, NCTU, Taiwan and CerebraTek, Taiwan.
01-core-interaction-model(subject-context).png
BCI Ontology
Regarding the Procedure concept: [SSN] defines a general Procedure concept that encompasses any kind of Observation and Actuation, so it fits the domain of BCI data capture activities well. Following ontology engineering good practices, and given that there is no BCI-specific description of Procedures for data capture activities, this ontology does not define any new concept for them. Therefore, BCI-O applications that need to model "Procedure" in their metadata definitions should align directly to the concept sosa:Procedure. It is worth noting that this practice applies to any other high-level concepts that BCI-O applications might include in their vocabulary.
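As a minimal Turtle sketch of this practice (the myapp prefix, class, and individual are hypothetical, introduced only for illustration), an application could align its own procedure description directly to sosa:Procedure:

```turtle
@prefix sosa:  <http://www.w3.org/ns/sosa/> .
@prefix rdfs:  <http://www.w3.org/2000/01/rdf-schema#> .
@prefix myapp: <http://example.org/myapp#> .

# A hypothetical application-level procedure class, aligned directly
# to sosa:Procedure instead of to a new BCI-O concept.
myapp:CalibrationProcedure rdfs:subClassOf sosa:Procedure .

myapp:eeg-baseline-calibration a myapp:CalibrationProcedure ;
    rdfs:comment "Two-minute eyes-closed baseline recording before each session." .
```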
0.9.5,0.9.4,0.9.3,0.9.2,0.9.1,0.8.9,0.7.5,0.6.1
Brain-Computer Interaction (BCI) Ontology
https://prezi.com/embed/cfcvd0wnx1uy/?bgcolor=ffffff&lock_to_path=0&autoplay=0&autohide_ctrls=0&landing_data=bHVZZmNaNDBIWnNjdEVENDRhZDFNZGNIUE43MHdLNWpsdFJLb2ZHanI5a2dPYjZRcFFhczZBQUNvMWVwa0g4ajV3PT0&landing_sign=1qEsK_JTVQmufpQJeDQUVGGXF87qe1jYl0MsFelGZww
Mappings to SAN: The Actuation Model of BCI-O was developed based on the following premises: (*) It aims to integrate and reconcile the SOSA axioms [SSN] and the SAN axioms [SAN] for modeling actuations and actuators. (*) It follows closely the proposed Actuation-Actuator-Effect (AAE) design pattern [AAE]: a core model of the IoT Ontology (IoT-O). The following diagram depicts the BCI-O alignment to SAN. <DCMIType-StillImage>06-Actuation-Model-Alignment-to-SAN.jpg</DCMIType-StillImage> As a broad application domain ontology for BCI activities, BCI-O integrates and refines some modeling considerations of the SOSA and SAN concepts regarding actuations and actuators. One example is the alignment of ActuationEvent to san:Effect (or san:ActuatorOutput): (*) A san:Effect defines any kind of physical modification (an effect on the physical world -- the Context) induced by an actuator (a characteristic of its nature as an agent that has an effect on the physical world). (*) An ActuationEvent is a Context.Event triggered by an Actuator that changes the state of the ActuationTarget (which is a Context.Object). Another modeling perspective inherited for BCI comes from the definition of the san:impacts object property: [san:Effect] -------- (san:impacts) -------- [oldssn:Property] The BCI-O alignment to SAN allows the following inferred relationship: [ActuationEvent] ---- (san:impacts) ---- [ImpactedProperty] The SOSA-SAN integrated Actuation Model of BCI-O represents a major contribution to the IoT and BCI communities.
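The inferred relationship above can be sketched in Turtle as follows. This is only an illustrative sketch: the ex individuals are hypothetical, and the san prefix IRI should be checked against the published SAN spec.

```turtle
# Prefix IRIs are illustrative; consult the published specs for canonical ones.
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix san: <https://www.irit.fr/recherches/MELODI/ontologies/SAN#> .
@prefix ex:  <http://example.org/app#> .

# A hypothetical actuation event that manipulates the flickering state
# of a target object; via the BCI-O alignment to SAN, a reasoner can
# relate the event to the impacted property through san:impacts.
ex:start-flickering-event a bci:ActuationEvent ;
    san:impacts ex:pillar-flickering-state .

ex:pillar-flickering-state a bci:ImpactedProperty .
```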
02-complete-UML-diagram.png
ESS+HED Standards Ontology for BCI-O: As the first BCI-O extension for the industry, this ontology specifies the relevant metadata vocabulary for BCI data capture activities under the ESS+HED Standards. Its main purpose is to provide a simple, compatible BCI-O based ontology for the ESS+HED Standards. Its spec is published at http://bci.pet.cs.nctu.edu.tw/ontology?ESS_HED.owl Its representational RDF graph, along with its alignments to BCI-O, is depicted in the following model: <DCMIType-StillImage>01-esshed-RDF-graph-diagram.png</DCMIType-StillImage>
CerebraTek Pod ontology (for an mfSSVEP visual stimuli pattern): As one of the earliest direct applications of BCI-O, this ontology specifies the relevant metadata vocabulary for BCI data capture activities using the CerebraTek Pod devices, applied to glaucoma diagnostics using mfSSVEP (Steady-State Visually Evoked Potential with Vision Field Sensitivity). Its spec is published at http://bci.pet.cs.nctu.edu.tw/ontology?cerebratek_nupod.owl Its representational RDF graph, along with its alignments to BCI-O, is depicted in the following model: <DCMIType-StillImage>01-ctnp-RDF-graph-diagram.png</DCMIType-StillImage>
2018-06-07T23:54:36
03-webvowl-preview.png
Copyright 2014 - 2018 PET Lab, Computer Science Department, NCTU, Taiwan.
0.9.6
The BCI ontology (BCI-O) provides a high-level semantic structure and a specialized metadata vocabulary set for real-world multimodal BCI data capture activities. It defines a minimalist and simple abstract metadata foundational model for real-world BCI applications that monitor human activity in any scenario. BCI multimodal domain applications are encouraged to extend and use this ontology in their implementations. BCI-O was developed following W3C Semantic Web ontology standards and guidelines, so that BCI applications can express reusable, interoperable and extendable machine-readable BCI metadata models, especially in pervasive M2M environments. For this purpose, its design was aligned to the Semantic Sensor Network Ontology (SSN), following closely its Stimulus-Sensor-Observation Ontology Design Pattern. The core set of relevant metadata definitions for real-world BCI activities was taken from different vocabularies and formats proposed in the BCI domain, such as XDF, ESS and HED. BCI-O concepts are logically grouped into 11 modules. Each module represents a central topic of the ontology structure, where the related concepts give a consistent account of its functional data model. The modules are: (*) Subject: concepts related to the depiction of a human being (or human subject) engaging in an activity, and its associated state. (*) Context: captures the architectural description of an environment (or context). A human being interacts with a context. (*) Session: represents the interaction between a subject and a context while performing a single activity, under specific settings and conditions. (*) Observations (was SSN-Skeleton): specific concepts for BCI activities aligned to the SOSA/SSN axioms for modeling Observations (the initial alignment was to the Skeleton of [oldSSN]). Metadata related to records, modality types (such as EEG), channeling information, output streams (file formats and access) and stimulus events are found in this module.
(*) Sensors (was SSN-Device): specific related concepts for BCI activities aligned to the SOSA/SSN axioms for modeling Sensors (under Observations) (the initial alignment was to the Device module of [oldSSN]). Metadata related to devices and their channeling specification are found in this module. (*) System Capabilities (was SSN-MeasurementCapability): specific related concepts for BCI activities aligned to the SSN horizontal segmentation module for System Capabilities (the initial alignment was to the Measurement Capability module of [oldSSN]). Metadata related to channels and other measurement properties are found in this module. (*) Results (was SSN-Data): specific related concepts for BCI activities aligned to the SOSA axioms for modeling Results (the initial alignment was to the Data module of [oldSSN]). Metadata related to data blocks, recorded data, and actuation results are defined in this module. (*) AnnotationTag: specific related concepts for annotating BCI records and results (data tagging). (*) Actuation: specific related concepts for BCI activities aligned to the SOSA and SAN axioms for modeling Actuations. As described in [Seydoux2016], this module depicts how a subject can interact with the physical/virtual world (context) in BCI activities. Its main classes, Actuator and Actuation, are modeled following the Actuation-Actuator-Effect (AAE) design pattern: a core model of the IoT Ontology (IoT-O). (*) Descriptor: a descriptor represents an external resource set that extends the description of entities in the ontology. A descriptor complements the information associated with the relevant metadata set defined in this ontology. (*) EEG: specific concepts for EEG (Electroencephalography) applications. As one of the ontologies ("concept producers") that reuse SSN, BCI-O was selected as part of the analysis on the usage of SSN.
This spec has been registered in the following open repositories (open repositories where BCI-O can be accessed): (*) w3id.org github: https://github.com/perma-id/w3id.org/tree/master/BCI-ontology (Permanent URI for the WWW). (*) Linked Open Vocabularies: http://lov.okfn.org/dataset/lov/vocabs/bci (LOD community). (*) BioPortal: http://bioportal.bioontology.org/ontologies/BCI-O (BioMedical community). Some early BCI-O applications are presented at the end of the spec.
01-a-overview-modules.png
04-OWLGrEd-preview.png
bci
The BCI ontology describes a framework of core concepts of the specialized metadata set for multimodal "Brain-Computer Interaction" (BCI) data capture activities. It is being developed by the "Pervasive Embedded Technologies" Laboratory (PET Lab) at the Computer Science Department of the National Chiao Tung University (NCTU), Taiwan (Republic of China, R.O.C). Its concepts and structure depict a foundational metadata model for BCI data capture activities that BCI applications can extend and use in their implementations. Any feedback is welcome. Please mail it to srodriguez@pet.cs.nctu.edu.tw
Context
Actuation.png
Context.png
2018-04-15T05:35:00
[Unity]
affects
Status: *STABLE*
Connects a Context.Event with a set of related Context.Objects that captures a perspective of their interactions over time.
affects
Connecting a Context.Event with related Context.Objects
AnnotationTag
Model_(SOSA-SSN).png
2018-02-08T01:55:00
analyzes
Status: *STABLE*
Indicates that an Aspect is analyzed by a Model (throughout its States), looking for specific kinds of Markers. A Model is specific to the purpose of its BCI application, such as stress level measurement or fatigue detection.
analyzes
Connecting a Model with an Aspect.
Observations
Record_(SOSA-SSN).png
2017-08-31T03:19:00
[SSN]
aspectOfInterest
Status: *STABLE*
Connects a Record with its corresponding Aspect. This can be read as follows: "A Record is generated by capturing an Aspect (of interest)". This object property is a subproperty of sosa:hasFeatureOfInterest: [sosa:Observation] ------ (sosa:hasFeatureOfInterest) ------ [sosa:FeatureOfInterest] [Record] -------------------- (aspectOfInterest) ---------------------- [Aspect]
aspect of interest
Connecting a Record individual with its corresponding Aspect
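The subproperty relation described above can be sketched in Turtle as follows (a sketch only; the published axiom may differ in form):

```turtle
@prefix bci:  <https://w3id.org/BCI-ontology#> .
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

bci:aspectOfInterest rdfs:subPropertyOf sosa:hasFeatureOfInterest .

# Under RDFS reasoning, any Record asserting bci:aspectOfInterest
# therefore also entails a sosa:hasFeatureOfInterest triple.
```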
Context
Context.png
2018-04-15T03:44:00
[Unity]
canPerform
Status: *STABLE*
States that a Context.AutonomousBeing "can perform" a set of Context.Capability-ies.
can perform
Connecting a Context.AutonomousBeing that can perform a set of Context.Capability-ies.
Context
Context.png
2018-04-15T05:35:00
[Unity]
causes
Status: *STABLE*
Connects a Context.Object with a set of related Context.Events that captures a perspective of its interactions over time.
causes
Connecting a Context.Object with related Context.Events
Actuation,Context
Actuation.png
2018-04-15T03:38:00
[SSN]
changes
Status: *STABLE*
Represents the relationship from an ActuationEvent to the thing or object (ActuationTarget) whose property (ImpactedProperty) is being manipulated by an Actuator.
%GENERAL_EXAMPLE%@Actuation-Use-Case
changes
Actuation
Actuation.png
2018-01-10T05:13:00
[Seydoux2016]
consumesInputFrom
Status: *STABLE*
A Command consumes its input from a Record. Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this relationship is based on the following definition: [Actuator]** ------ (consumes) ------ [Input] ** Via the execution of a Command.
%GENERAL_EXAMPLE%@Actuation-Use-Case
consumes input from
Connecting a Command with a Record.
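A minimal Turtle sketch of this AAE-style chain, where an actuator executes a command whose input is consumed from an observation record (all ex individuals are hypothetical):

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/app#> .

# Hypothetical individuals: a display actuator executes a command,
# and the command consumes its input from a BCI record.
ex:display-actuator a bci:Actuator ;
    bci:executes ex:raise-flicker-rate .

ex:raise-flicker-rate a bci:Command ;
    bci:consumesInputFrom ex:some-record .

ex:some-record a bci:Record .
```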
Context
Context.png
2018-04-15T03:44:00
[Unity]
coparticipatesIn
Status: *STABLE*
Connects a Context.Object as a coparticipant in a set of Context.Methods.
coparticipates in
Connecting a Context.Object that coparticipates in a set of Context.Methods.
Context
Context.png
2018-04-15T03:44:00
[Unity]
definesBehaviorOf
Status: *STABLE*
Connects a Context.Method with a set of Context.Objects that models a perspective of their expected behavior.
defines behavior of
Connecting a Context.Method that models the behavior of some Context.Objects.
Sensors
StimulusEvent_(SOSA-SSN).png
2017-09-14T02:02:00
[SSN]
detects
Status: *STABLE*
Connects a Device with its corresponding StimulusEvent set. This can be read as follows: "A Device detects StimulusEvents". This object property is a subproperty of ssn:detects: [sosa:Sensor] ------ (ssn:detects) ------ [ssn:Stimulus] [Device] ---------- (detects) -------- [StimulusEvent] [SSN] A relation from a sosa:Sensor to the ssn:Stimulus that the sosa:Sensor can detect.
detects
Connecting a Device individual with its corresponding StimulusEvent set.
Context
Context.png
2018-04-15T03:44:00
[Unity]
effectuates
Status: *STABLE*
Connects a Context.Event with some related Context.Methods as a part of an interaction.
From the perspective of the Object-Oriented Programming paradigm, this relationship captures a set of object messages in a specific time frame: an interaction between Context.Objects through their Context.Methods.
effectuates
Connecting a Context.Event with some Context.Methods.
Actuation
Actuation.png
2018-05-08T18:13:00
[Seydoux2016]
executes
Status: *STABLE*
An Actuator executes a Command. Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this relationship expands the following definition: [Actuator] ------ (consumes) ------ [Input]** ** The Input from a Record via the execution of a Command.
See general remark about: 2_MAPPINGS-TO-SAN
%GENERAL_EXAMPLE%@Actuation-Use-Case
executes
Connecting an Actuator with a Command.
Descriptor,Observations
Aspect-and-Modality_(SOSA-SSN).png
2018-01-15T00:54:00
extends
Status: *STABLE*
Relation between a RecordChannelingSpec and the ChannelingSpec that the observation extends through the associated DeviceChannelingSpec. The object property composition (owl:propertyChainAxiom) ensures that if a DeviceChannelingSpec extends a particular ChannelingSpec, then the RecordChannelingSpec can be inferred to extend that ChannelingSpec as well. The chain over the channeling schema information objects reads: RecordChannelingSpec.extendsDeviceChannelingSpec * DeviceChannelingSpec.extendsChannelingSpec --> RecordChannelingSpec.extends := ChannelingSpec.
extends its related modality channeling schema spec
A hasDescriptor sub property that connects a RecordChannelingSpec with its related ChannelingSpec.
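The property composition described above can be sketched with an owl:propertyChainAxiom (a sketch only; the axiom as published in the spec may differ in form):

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# extendsDeviceChannelingSpec followed by extendsChannelingSpec
# implies extends on the RecordChannelingSpec.
bci:extends owl:propertyChainAxiom
    ( bci:extendsDeviceChannelingSpec bci:extendsChannelingSpec ) .
```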
Descriptor,Sensors
Aspect-and-Modality_(SOSA-SSN).png
2016-07-29T03:18:00
extendsChannelingSpec
Status: *STABLE*
Connects a DeviceChannelingSpec with its related ChannelingSpec. This relation states that a DeviceChannelingSpec individual extends the ChannelingSpec from which it was derived.
extends its modality channeling schema spec
A hasDescriptor sub property that connects a DeviceChannelingSpec with its related ChannelingSpec.
Descriptor,Observations
Aspect-and-Modality_(SOSA-SSN).png
2016-07-29T02:21:00
extendsDeviceChannelingSpec
Status: *STABLE*
Connects a RecordChannelingSpec with its related DeviceChannelingSpec. This relation states that a RecordChannelingSpec individual extends the DeviceChannelingSpec from which it was derived.
extends its device channeling schema spec
A hasDescriptor sub property that connects a RecordChannelingSpec with its related DeviceChannelingSpec.
SystemCapabilities
Aspect-and-Modality_(SOSA-SSN).png
2017-09-14T04:34:00
[SSN]
forModality
Status: *STABLE*
Connects a Channel to the supported Modality it is described for. This can be read as follows: "A Channel is described for (supports) a Modality". This object property is a subproperty of ssn:forProperty: [ssn-system:SystemCapability] ------ (ssn:forProperty) ------ [sosa:ObservableProperty] [Channel] ---------------------- (forModality) ---------------------- [Modality] [SSN] A relation from an ssn-system:SystemCapability to the sosa:ObservableProperty the capability is described for.
See general remark about: EEG-CONCEPTS
for modality
Connecting a Channel to the supported Modality it is described for.
Observations
RecordedData_(SOSA-SSN).png
2018-02-10T04:18:00
hasAccessMethod
Status: *STABLE*
Connects a RecordedData with a set of AccessMethods that describe how the data is accessed by the BCI application.
has BCI data access method
Connecting a RecordedData with its associated AccessMethods
Session,Subject
Activity.png
2018-04-15T18:38:00
hasAction
Status: *STABLE*
Connects an Activity with its corresponding Action set.
has action
Connecting an Activity with its corresponding Action set.
Session
Session.png
2018-02-08T04:10:00
hasActivity
Status: *STABLE*
Connects a Session with its associated Activity.
has activity
Connecting a Session individual with its associated Activity.
Session,Actuation
Actuation.png
2018-02-08T02:07:00
hasActuation
Status: *STABLE*
Connects a Session with a set of Actuations that are associated with it.
%GENERAL_EXAMPLE%@Actuation-Use-Case
has actuation
Connecting a Session with its related set of Actuations
Descriptor,Actuation
Actuation.png
2017-12-11T02:30:00
hasActuatorSpec
Status: *STABLE*
Connects an (Actuator or ActuatorSpec) with its set of related ActuatorSpecs.
has actuator spec
A hasDescriptor sub property for connecting an (Actuator or ActuatorSpec) with its set of ActuatorSpecs.
Sensors
MeasurementCapability_(SOSA-SSN).png
2016-08-14T05:19:00
[XDF]
hasChannelData
Status: *STABLE*
Connects a DeviceChannelingSpec with the set of Channels that comprises its internal structure.
See general remark about: EEG-CONCEPTS
has channel data (logical component)
Connecting a DeviceChannelingSpec with its Channel set.
Observations
Aspect-and-Modality_(SOSA-SSN).png
2016-06-30T01:43:00
hasChannelingSpec
Status: *STABLE*
Connects a Modality with its related ChannelingSpec.
has channeling schema spec
A hasDescriptor sub property for connecting a Modality with its related ChannelingSpec.
Results,AnnotationTag
DataSegment.png
2018-02-08T02:39:44
hasDataBlock
Status: *STABLE*
Connects a (RecordedData or DataSegment) with its corresponding DataBlock set.
has data block set
Connecting a (RecordedData or DataSegment) individual with its corresponding DataBlock set.
Observations
RecordedData_(SOSA-SSN).png
2018-02-10T03:34:32
hasDataFormat
Status: *STABLE*
Connects a RecordedData with its corresponding DataFormat that describes the representation of the data observed by a Device.
has data format
Connecting a RecordedData with its corresponding DataFormat
AnnotationTag,Context,Descriptor,Session,SystemCapabilities,Observations,Subject
Descriptor_(SOSA-SSN).png
2018-02-08T02:43:00
hasDescriptor
Status: *STABLE*
Connects an entity with a set of Descriptors.
has external resource (descriptor)
Connecting an entity with a set of Descriptors.
Sensors
Device_(SOSA-SSN).png
2016-07-19T04:33:00
hasDeviceChannelingSpec
Status: *STABLE*
Connects a Device with its related DeviceChannelingSpec.
has device channeling schema spec
A sub property of hasDescriptor for connecting a Device with its related DeviceChannelingSpec.
Descriptor,Sensors
Device_(SOSA-SSN).png
2016-06-30T01:43:00
[XDF], [ESS]
hasDeviceSpec
Status: *STABLE*
Connects a (Device or DeviceSpec) with its set of related DeviceSpecs.
has device spec
A hasDescriptor sub property for connecting a (Device or DeviceSpec) with its set of DeviceSpecs.
EEG
MeasurementCapability_(SOSA-SSN).png
2016-08-14T05:39:00
[XDF]
hasEegChannelData
--Concerning EEG, this ontology only defines its related classes. It does not extend or define any specific properties for EEG.-- $ 04:27 AM 2016-07-29 $
true
Status: *STABLE*
Connects an EegDeviceChannelingSpec with the set of EegChannels that comprises its internal structure.
See general remark about: EEG-CONCEPTS
has EEG channel data
Connecting an EegDeviceChannelingSpec with its EegChannel set.
EEG
MeasurementCapability_(SOSA-SSN).png
2016-08-10T06:48:00
[SSN]
hasEegNonChannelData
--Concerning EEG, this ontology only defines its related classes. It does not extend or define any specific properties for EEG.-- $ 04:27 AM 2016-07-29 $
true
Status: *STABLE*
[SSN] Relation from an EegDevice to its EegNonChannel, describing the non-channeling measurement capabilities (a set of measurement properties) of the EEG BCI device.
See general remark about: EEG-CONCEPTS
This ontology leaves open to BCI applications how to properly describe the basic non-channeling measurement capabilities of the different classes of sensors (the Device class hierarchy) used in BCI activities, following the description of the ssn-system:SystemCapability concept. Based on their system requirements, BCI applications may define a set of restrictions and specialized connections (subproperties) on the property hasNonChannelData (a subproperty of ssn-system:hasSystemCapability) for each particular subclass of Device (a subclass of sosa:Sensor) that describes sensors of a specific type.
has non-channeling EEG data (other EEG measurement capability)
AnnotationTag
Model_(SOSA-SSN).png
Record_(SOSA-SSN).png
2018-01-03T01:13:00
hasFeatureParameter
Status: *STABLE*
A ResponseTag or a Record has a set of FeatureParameters.
has feature
Connecting a ResponseTag or a Record with its corresponding set of FeatureParameters.
Context
2016-08-14T06:22:00
hasLocation
--This ontology will not define a "location" concept of a Context. BCI applications may extend its own ontology to include this definition if necessary.-- $ 06:26 AM 2016-08-14 $
true
Status: *STABLE*
Connects a Context with an entity that represents or describes its location.
has location
Connecting a Context with an entity that represents or describes its location.
Observations
Record_(SOSA-SSN).png
2016-08-30T23:26:00
hasMeasurementProperty
Status: *STABLE*
Connects a Record with a set of ssn-system:SystemProperty-ies (see ssn-system:SystemProperty). Through this relationship, BCI applications may extend the relevant metadata set related to the Record concept.
has SSN system property
Connecting a Record with a ssn-system:SystemProperty set.
Session
Session.png
2018-01-08T04:43:00
hasMember
Status: *STABLE*
Groups a set of Sessions and/or Interactions under a Collection.
has member (groups)
Connecting a Collection with its related set of Sessions and/or Interactions.
Observations
Aspect-and-Modality_(SOSA-SSN).png
2017-08-31T12:21:00
[SSN]
hasModality
Status: *STABLE*
Connects an Aspect with its corresponding Modality set. This can be read as follows: "An Aspect has Modality(ies)". This object property is a subproperty of ssn:hasProperty: [sosa:FeatureOfInterest] ------ (ssn:hasProperty) ------ [sosa:ObservableProperty] [Aspect] -------------------- (hasModality) ---------------------- [Modality]
has modality
Connecting an Aspect with its corresponding Modality set.
AnnotationTag
Model_(SOSA-SSN).png
2018-01-02T02:37:00
hasModel
Status: *STABLE*
A ResponseTag or FeatureParameter is associated with (has) a Model.
has model
Connecting a ResponseTag or FeatureParameter with its corresponding Model.
Context,Results,Observations
Context.Scene.png
DataBlock_(SOSA-SSN).png
2018-04-15T22:50:00
hasNext
Status: *STABLE*
Connects a (Context.Scene or Context.Event or Record or RecordedData or DataBlock) with its following (next) (Context.Scene or Context.Event or Record or RecordedData or DataBlock) of the sequence.
(*) [Context.Scene] On a Video Game: (Level 3-2) hasNext (Level 3-3). (*) [Context.Event]: links to the following event on a sequence. (eating a meal) hasNext (taking medicine). (*) [Record]: an observation is linked to its following observation. Their difference could be on their channeling settings. (*) [RecordedData]: links to the following data version of the current data set. (*) [DataBlock]: points to the following data unit value from the current one along the sequence.
has next (following)
Connecting a (Context.Scene or Context.Event or Record or RecordedData or DataBlock) with its corresponding next (Context.Scene or Context.Event or Record or RecordedData or DataBlock).
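The video-game example above can be sketched in Turtle as follows (the ex individuals are hypothetical):

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/game#> .

# Hypothetical scene sequence: Level 3-2 sits between Level 3-1 and Level 3-3.
ex:level-3-2 a bci:Context.Scene ;
    bci:hasPrevious ex:level-3-1 ;
    bci:hasNext ex:level-3-3 .
```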
Sensors
MeasurementCapability_(SOSA-SSN).png
2017-08-30T22:44:00
[SSN]
hasNonChannelData
Status: *STABLE*
[SSN] Relation from a Device to a NonChannel describing the non-channeling measurement capabilities (a set of measurement properties) of the BCI device.
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
See general remark about: EEG-CONCEPTS
This ontology leaves open to BCI applications how to properly describe the basic non-channeling measurement capabilities of the different classes of sensors (the Device class hierarchy) used in BCI activities, as explained in the [SSN] System Capabilities Module and the [oldSSN] MeasuringCapability Module. Based on their system requirements, BCI applications may define a set of restrictions and specialized connections (subproperties) on the property hasNonChannelData (a subproperty of ssn-system:hasSystemCapability) for each particular subclass of Device (a subclass of sosa:Sensor) that describes sensors of a specific type.
has non-channeling data (other BCI measurement capability)
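A sketch of the kind of application-level specialization described above (all myapp names are hypothetical, introduced only for illustration):

```turtle
@prefix bci:   <https://w3id.org/BCI-ontology#> .
@prefix owl:   <http://www.w3.org/2002/07/owl#> .
@prefix rdfs:  <http://www.w3.org/2000/01/rdf-schema#> .
@prefix myapp: <http://example.org/myapp#> .

# A hypothetical Device subclass with its own specialized subproperty
# of hasNonChannelData for non-channeling measurement capabilities.
myapp:GsrDevice rdfs:subClassOf bci:Device .
myapp:hasGsrNonChannelData rdfs:subPropertyOf bci:hasNonChannelData .

# A restriction tying the subclass to non-channeling capabilities only.
myapp:GsrDevice rdfs:subClassOf [
    a owl:Restriction ;
    owl:onProperty myapp:hasGsrNonChannelData ;
    owl:allValuesFrom bci:NonChannel
] .
```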
Context
Context.png
2018-02-08T04:16:00
[Unity]
hasObject
Status: *STABLE*
Connects a (Context.Scene or Context.Object) with the set of Context.Objects that comprises its internal structure.
has object
Connecting a (Context.Scene or Context.Object) with its Context.Objects
Context
Context.ObjectComponent.png
2018-03-14T03:35:00
[Unity]
hasObjectComponent
--Following closely the alignment to DUL, the concepts about Objects and Events are distinctly separated. Therefore, from a structural perspective, a Context.ObjectComponent is a Context.Object except for Context.ObjectComponent.Event which is changed to Context.Event. This object property is not being used anymore.--$ 03:21 AM 2018-03-14 $
true
Status: *STABLE*
Connects a (Context.Object or Context.ObjectComponent) with the set of Context.ObjectComponents that comprises its internal structure.
has object component
Connecting a (Context.Object or Context.ObjectComponent) with its Context.ObjectComponents
Context,Session
Playout.png
2018-02-08T03:16:00
hasPlayout
Status: *STABLE*
Connects a (Context or Session) with its set of Playouts.
has playout record
Connecting a (Context or Session) with its Playouts.
Session,Context
Activity.png
PlayoutInstant.png
2018-02-11T03:59:00
hasPlayoutInstant
Status: *STABLE*
Connects a (Playout or Context.Event or Action) with its corresponding PlayoutInstant(s) log entries.
has playout instant
Connecting a (Playout or Context.Event or Action) individual with its corresponding PlayoutInstant(s)
Context,Results,Observations
Context.Scene.png
DataBlock_(SOSA-SSN).png
2016-07-07T02:50:00
hasPrevious
Status: *STABLE*
Connects a (Context.Scene or Record or RecordedData or DataBlock) with its previous (Context.Scene or Record or RecordedData or DataBlock) of the sequence.
(*) [Context.Scene] On a Video Game: (Level 3-2) hasPrevious (Level 3-1). (*) [Record]: an observation is linked to its previous observation. Their difference could be on their channeling settings. (*) [RecordedData]: links to the previous data version of the current data set. (*) [DataBlock]: points to the previous data unit value from the current one along the sequence.
has previous (before)
Connecting a (Context.Scene or Record or RecordedData or DataBlock) with its previous (Context.Scene or Record or RecordedData or DataBlock).
Session,Subject
Record_(SOSA-SSN).png
2016-06-24T00:07:00
hasRecord
Status: *STABLE*
Connects a (Subject or Session) with a set of Records that are associated with it.
has BCI record
Connecting a (Subject or Session) with its related set of Records
Observations
Record_(SOSA-SSN).png
2016-07-18T03:24:00
hasRecordChannelingSpec
Status: *STABLE*
Connects a Record with its related RecordChannelingSpec.
has record channeling schema spec
A hasDescriptor sub property for connecting a Record with its related RecordChannelingSpec.
Descriptor,Observations
Record_(SOSA-SSN).png
2016-07-18T03:01:00
[XDF], [ESS]
hasRecordSpec
Status: *STABLE*
Connects a (Record or RecordSpec) with its set of related RecordSpecs.
has record spec
A hasDescriptor sub property for connecting a (Record or RecordSpec) with its set of RecordSpecs.
Context
Context.Role.png
2018-02-08T02:28:00
[Unity]
hasRole
Status: *STABLE*
Connects a Context.Object with its Context.Role.
has role
Context
Context.Scene.png
2018-02-08T03:06:00
hasScene
Status: *STABLE*
Connects a (Context or Context.Scene) with its Context.Scenes.
has scene
Connecting a (Context or Context.Scene) with its Context.Scenes
Context,Session,Subject
Session.png
2018-02-08T03:08:00
hasSession
Status: *STABLE*
Connects a (Context or Interaction or Subject) with a set of Sessions that are associated with it.
has session
Connecting a (Context or Interaction or Subject) with its related set of Sessions
AnnotationTag
Marker_(SOSA-SSN).png
2016-05-22T18:51:00
hasStimulusEvent
Status: *STABLE*
A StimulusTag is associated with (has) a StimulusEvent.
has stimulus event
Connecting a StimulusTag with its corresponding StimulusEvent.
Session,Subject
Subject.png
2016-06-23T01:55:00
hasSubject
Status: *STABLE*
Connects an Interaction with its set of Subjects.
has subject (participant)
Connecting an Interaction with its set of Subjects
Session
Session.png
2016-06-30T01:43:00
hasSubjectState
Status: *STABLE*
Connects a Session with a set of SubjectStates which describe the overall state of the Subject during the Session.
has subject state
A hasDescriptor sub property for connecting a Session with a set of SubjectStates
Observations
RecordedData_(SOSA-SSN).png
2018-01-17T05:31:00
[oldSSN], [SSN]
hasValue
--Previously, both RecordedData and DataBlock were aligned to sosa:Result. In order to keep the model simple, DataBlock's alignment was removed. Therefore, the property hasDataBlock will be used to connect these two concepts.-- $ 05:05 AM 2018-01-17 $
true
Status: *STABLE*
Connects a RecordedData with its corresponding DataBlock set. This object property is a subproperty of sosa:hasResult (previously a subproperty of oldssn:hasValue): [oldssn:SensorOutput] ------ (sosa:hasResult) ------ [oldssn:ObservationValue] [RecordedData] -------------- (hasValue) -------------------- [DataBlock]
A SPARQL triple pattern to find the DataBlocks of a Record via this object property would be: ?Record bci:observationResult ?RecordedData . ?RecordedData bci:hasValue ?DataBlock . Based on the following relationships: [Record] -------- (observationResult) ------ [RecordedData] [RecordedData] ---------- (hasValue) -------------- [DataBlock]
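The triple pattern above can be extended into a complete query; a minimal sketch, assuming the bci: prefix resolves to the BCI-O namespace and rec:myRecord is a hypothetical Record individual:

```sparql
PREFIX bci: <https://w3id.org/BCI-ontology#>
PREFIX rec: <http://example.org/records#>   # hypothetical data namespace

# Retrieve every DataBlock of a given Record, following
# Record --(observationResult)--> RecordedData --(hasValue)--> DataBlock
SELECT ?dataBlock
WHERE {
  rec:myRecord  bci:observationResult ?recordedData .
  ?recordedData bci:hasValue          ?dataBlock .
}
```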
has value (data blocks)
Connecting a RecordedData individual with its corresponding DataBlocks
Context
Context.Scene.png
2018-04-15T03:44:00
[Unity]
includesEvent
Status: *STABLE*
Connects a Context.Scene with a set of Context.Events.
includes event
Connecting a Context.Scene with a set of Context.Events.
Actuation,Results
Actuation.png
2018-04-22T15:43:00
[Seydoux2016]
involves
Status: *STABLE*
An ActuationResult involves an ActuationEvent that causes an effect on the ActuationTarget. Following the Actuation-Actuator-Effect (AAE) design pattern proposed in the IoT ontology IoT-O, this property captures the following definition: [Actuation] ------ (involves) ------ [Effect]
%GENERAL_EXAMPLE%@Actuation-Use-Case
involves
Session,Actuation
Actuation.png
2018-02-08T02:07:00
isActuationOf
Status: *STABLE*
Connects an Actuation with its associated Session.
is actuation of
Connecting an Actuation with its related Session.
SystemCapabilities
MeasurementCapability_(SOSA-SSN).png
2016-08-14T05:28:00
[XDF]
isChannelDataOf
Status: *STABLE*
Connects a Channel with its associated DeviceChannelingSpec.
See general remark about: EEG-CONCEPTS
is channel (logical component) data of
Connecting a Channel with its associated DeviceChannelingSpec definition.
EEG
MeasurementCapability_(SOSA-SSN).png
2016-08-14T05:45:00
[XDF]
isEegChannelDataOf
--Concerning EEG, this ontology only defines its related classes. It does not extend or define any specific properties for EEG.-- $ 04:27 AM 2016-07-29 $
true
Status: *STABLE*
Connects an EegChannel with its associated EegDeviceChannelingSpec.
See general remark about: EEG-CONCEPTS
is EEG channel data of
Connecting an EegChannel with its associated EegDeviceChannelingSpec definition.
Context
2018-04-15T03:44:00
[Unity]
isEventIncludedIn
Status: *STABLE*
Connects a Context.Event with a set of Context.Scenes.
is event included in
Connecting a Context.Event with a set of Context.Scenes.
Actuation
Actuation.png
2018-05-08T18:23:00
isExecutedBy
Status: *STABLE*
A Command is executed by an Actuator.
See general remark about: 2_MAPPINGS-TO-SAN
%GENERAL_EXAMPLE%@Actuation-Use-Case
is executed by
Connecting a Command with an Actuator.
Actuation
Actuation.png
2018-01-10T05:23:00
isInputFor
Status: *STABLE*
A Record is input for a Command.
%GENERAL_EXAMPLE%@Actuation-Use-Case
is input for
Connecting a Record with a Command.
Session,Subject
2018-02-05T04:46:00
isMemberOf
Status: *STABLE*
Connects a set of Sessions and/or Interactions with a Collection.
is member of
Connecting a set of Sessions and/or Interactions with a Collection.
Observations
Aspect-and-Modality_(SOSA-SSN).png
2017-08-31T02:28:00
[SSN]
isModalityOf
Status: *STABLE*
Connects a Modality with its corresponding Aspect. This can be read as follows: "A Modality is modality of an Aspect". This object property is a subproperty of ssn:isPropertyOf: [sosa:ObservableProperty] ------ (ssn:isPropertyOf) ------ [sosa:FeatureOfInterest] [Modality] -------------------- (isModalityOf) -------------------- [Aspect]
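The alignment above can be written down in Turtle; a minimal sketch, assuming the bci: prefix resolves to the BCI-O namespace and ex:alphaPower / ex:attention are hypothetical individuals:

```turtle
@prefix bci:  <https://w3id.org/BCI-ontology#> .
@prefix ssn:  <http://www.w3.org/ns/ssn/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/bci-data#> .   # hypothetical data namespace

# BCI-O property aligned to SSN
bci:isModalityOf  rdfs:subPropertyOf  ssn:isPropertyOf .

# A Modality linked to its Aspect; under the alignment, a reasoner
# may also infer: ex:alphaPower ssn:isPropertyOf ex:attention .
ex:alphaPower  bci:isModalityOf  ex:attention .
```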
is modality of
Connecting a Modality with its corresponding Aspect.
AnnotationTag
Model_(SOSA-SSN).png
2018-01-02T02:45:00
isModelOf
Status: *STABLE*
A Model has an associated set of ResponseTags or FeatureParameters.
is model of
Connecting a Model with its corresponding set of ResponseTags or FeatureParameters.
Observations
Record_(SOSA-SSN).png
RecordedData_(SOSA-SSN).png
2018-01-25T02:30:00
isObservationResultOf
Status: *STABLE*
Connects a RecordedData set with its corresponding Record.
is observation result of (a BCI Record)
Connecting a RecordedData set with its corresponding Record
Context
2018-02-08T02:41:00
isPlayoutInstantOf
Status: *STABLE*
Connects a PlayoutInstant log entry with the corresponding individual (Playout or Context.Event or Action) that issued its creation.
is the playout instant of
Connecting a PlayoutInstant with its corresponding (Playout or Context.Event or Action) individual.
Context
Playout.png
2018-02-08T03:17:00
isPlayoutOf
Status: *STABLE*
Connects a set of Playouts with their corresponding (Context or Session).
is playout record of
Connecting a Playout individual with exactly one (Context or Session)
Observations
RecordedData_(SOSA-SSN).png
2018-01-25T02:51:00
[oldSSN]
isProducedByDevice
Status: *STABLE*
Connects a RecordedData with the corresponding Device that produced it. This can be read as follows: "A RecordedData is produced by a Device". This supports the following inference: RecordedData.sosa:isResultOf * Record.sosa:madeBySensor --> RecordedData.isProducedByDevice := Device.
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
This object property was previously defined as a subproperty of oldssn:isProducedBy: [oldssn:SensorOutput] ------ (oldssn:isProducedBy) ------ [oldssn:Sensor] [RecordedData] -------- (isProducedByDevice) ---------- [Device]
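The inference above corresponds to an OWL 2 property chain; a minimal sketch of how it could be declared, assuming the bci: prefix resolves to the BCI-O namespace (the axiom is an illustration, not necessarily the normative one in the published ontology):

```turtle
@prefix bci:  <https://w3id.org/BCI-ontology#> .
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .

# If a RecordedData is the result of a Record, and that Record was
# made by a Device, then the RecordedData was produced by that Device:
#   sosa:isResultOf o sosa:madeBySensor -> bci:isProducedByDevice
bci:isProducedByDevice
    owl:propertyChainAxiom ( sosa:isResultOf sosa:madeBySensor ) .
```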
is produced by device
Connecting a RecordedData with the corresponding Device that produced it.
Observations
StimulusEvent_(SOSA-SSN).png
2017-09-14T03:35:00
[SSN]
isProxyFor
Status: *STABLE*
Connects a StimulusEvent with its associated Modalities. This can be read as follows: "A StimulusEvent is proxy for a Modality". This object property is a subproperty of ssn:isProxyFor: [ssn:Stimulus] ------ (ssn:isProxyFor) ------ [sosa:ObservableProperty] [StimulusEvent] ------ (isProxyFor) -------------------- [Modality]
The following descriptions capture the definition of this relation (4.2.13 Stimuli-Centered, 5.3.1.2.1 Stimuli), adjusted to this ontology: (*) The role of StimulusEvents as a proxy between the Device and the object of sensing (Context.Object). (*) A StimulusEvent may only be usable as a proxy for a specific region of an observed Modality.
is proxy for
Connecting a StimulusEvent with its associated Modality(ies).
Observations
Record_(SOSA-SSN).png
2016-06-24T00:16:00
isRecordOf
Status: *STABLE*
Connects a Record with its associated (Subject or Session).
is BCI record of
Connecting a Record with its related (Subject or Session).
AnnotationTag
DataSegment.png
2018-04-18T03:16:00
isReferencedBy
Status: *STABLE*
Connects a DataSegment with a set of Markers.
is referenced by
Connecting a DataSegment individual with a set of Markers.
Session
Session.png
2018-02-08T03:33:00
isSessionOf
Status: *STABLE*
Connects a Session with its associated (Context or Interaction or Subject).
is session of
Connecting a Session with its related (Context or Interaction or Subject).
Context,Observations
Marker_(SOSA-SSN).png
2016-06-23T01:55:00
isStimulusEventOf
Status: *STABLE*
A StimulusEvent generates a set of StimulusTags.
is stimulus event of (generates)
Connecting a StimulusEvent with its set of StimulusTags.
Subject
Subject.png
2016-06-23T01:55:00
isSubjectOf
Status: *STABLE*
Connects a Subject with the set of Interactions in which he/she participates.
is subject of (participates in)
Connecting a Subject with the set of Interactions in which he/she participates.
Results
2018-01-17T05:19:00
isValueOf
--Previously, both RecordedData and DataBlock were aligned to sosa:Result. In order to keep the model simple, DataBlock's alignment was removed. Therefore, the property hasDataBlock will be used to connect these two concepts.-- $ 05:05 AM 2018-01-17 $
true
Status: *STABLE*
Connects a DataBlock set with its corresponding RecordedData.
is value of (recorded data)
Connecting a DataBlock individual with its corresponding RecordedData
Context,Subject
Context.png
Subject.png
2018-04-16T00:35:00
[Unity]
issues
Status: *STABLE*
Connects a Subject with a set of Actions that capture the interactions she performs in the Context.
A Subject interacts with a Context through a set of Actions that she issues while performing an Activity during a Session.
issues
Connecting a Subject with related Actions.
EEG
EegRecord_(SOSA-SSN).png
2016-07-22T02:47:00
madeEegRecord
--Concerning EEG, this ontology only defines its related classes. It does not extend or define any specific properties for EEG.-- $ 04:27 AM 2016-07-29 $
true
Status: *STABLE*
Connects an EegDevice with its corresponding EegRecords. This can be read as follows: "An EegDevice made EegRecords". This object property is a subproperty of madeRecord: [Device] ------------ (madeRecord) ------------ [Record] [EegDevice] ------ (madeEegRecord) ------ [EegRecord]
See general remark about: EEG-CONCEPTS
made EEG record
Connecting an EegDevice individual with its corresponding EegRecord
Sensors
Record_(SOSA-SSN).png
2017-08-31T03:47:00
[SSN]
madeRecord
Status: *STABLE*
Connects a Device with its corresponding Records. This can be read as follows: "A Device made Records". This object property is a subproperty of sosa:madeObservation: [sosa:Sensor] ------ (sosa:madeObservation) ------ [sosa:Observation] [Device] -------------------- (madeRecord) ------------------ [Record]
See general remark about: EEG-CONCEPTS
made BCI record
Connecting a Device individual with its corresponding Record
Observations
Record_(SOSA-SSN).png
RecordedData_(SOSA-SSN).png
2017-09-14T03:30:00
[oldSSN], [SSN]
observationResult
Status: *STABLE*
Connects a Record with its corresponding RecordedData set. This can be read as follows: "A Record has as its observation result a RecordedData set". This object property is a subproperty of sosa:hasResult (previously a subproperty of oldssn:observationResult): [sosa:Observation] ------ (sosa:hasResult) ------ [oldssn:SensorOutput] [Record] ---------- (observationResult) ------ [RecordedData]
observation result (of a BCI record)
Connecting a Record individual with its corresponding RecordedData set
Observations
Record_(SOSA-SSN).png
2017-08-31T03:47:00
[SSN]
observedByDevice
Status: *STABLE*
Connects a Record with its corresponding Device. This can be read as follows: "A Record is observed by a Device". This object property is a subproperty of sosa:madeBySensor: [sosa:Observation] ------ (sosa:madeBySensor) ------ [sosa:Sensor] [Record] ------------ (observedByDevice) ---------- [Device]
See general remark about: EEG-CONCEPTS
observed by device
Connecting a Record individual with its corresponding Device
EEG
EegRecord_(SOSA-SSN).png
2016-07-22T02:47:00
observedByEegDevice
--Concerning EEG, this ontology only defines its related classes. It does not extend or define any specific properties for EEG.-- $ 04:27 AM 2016-07-29 $
true
Status: *STABLE*
Connects an EegRecord with its corresponding EegDevice. This can be read as follows: "An EegRecord is observed by an EegDevice". This object property is a subproperty of observedByDevice: [Record] ------------ (observedByDevice) ------------ [Device] [EegRecord] ------ (observedByEegDevice) ------ [EegDevice]
See general remark about: EEG-CONCEPTS
observed by EEG device
Connecting an EegRecord individual with its corresponding EegDevice
Observations
Record_(SOSA-SSN).png
2016-09-06T05:10:00
[SSN], [ESS]
observedModality
Status: *STABLE*
Connects a Record with its corresponding Modality. This can be read as follows: "A Record observes a Modality". This object property is a subproperty of sosa:observedProperty: [sosa:Observation] ------ (sosa:observedProperty) ------ [sosa:ObservableProperty] [Record] -------------- (observedModality) -------------------- [Modality]
[ESS 1.0]: a record (bci:Record) has a specific (single) defined modality (bci:Modality). [ESS 2.0]: a record (bci:Session) has a specific defined RecordedParameterSet, which groups various (multiple) RecordedModalities (Modalities). Therefore, a record (bci:Session) can be associated with multiple RecordedModality (bci:Modality) definitions.
observed modality
Connecting a Record individual with its corresponding Modality
Sensors
Device_(SOSA-SSN).png
MeasurementCapability_(SOSA-SSN).png
Record_(SOSA-SSN).png
2017-09-14T04:02:00
[SSN], [Compton2009]
observes
Status: *STABLE*
[SSN] Relation between a Device (sosa:Sensor) and a Modality (sosa:ObservableProperty) that the sensor supports (can observe). The object property composition (owl:propertyChainAxiom) ensures that if a Record (sosa:Observation) was made of a particular Modality (sosa:ObservableProperty), then one can infer that the Device (sosa:Sensor) supports (observes) that Modality (quality). This supports the following inferences: (*) Device.madeRecord * Record.observedModality --> Device.observes := Modality. (*) Device.hasDeviceChannelingSpec * DeviceChannelingSpec.hasChannelData * Channel.forModality --> Device.observes := Modality.
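A minimal Turtle sketch of the two chains, assuming all the named properties resolve in the bci: namespace (the exact axioms in the published ontology may differ):

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# Chain 1: a Device that made a Record of a Modality observes that Modality.
#   bci:madeRecord o bci:observedModality -> bci:observes
# Chain 2: a Device whose channeling spec declares a Channel for a Modality
# also observes that Modality.
bci:observes
    owl:propertyChainAxiom ( bci:madeRecord bci:observedModality ) ;
    owl:propertyChainAxiom ( bci:hasDeviceChannelingSpec
                             bci:hasChannelData
                             bci:forModality ) .
```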
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
See general remark about: EEG-CONCEPTS
observes
Connecting a Device individual with its corresponding supported Modality individual.
SystemCapabilities
Aspect-and-Modality_(SOSA-SSN).png
2017-08-09T02:39:00
[SSN]
ofAspect
--The property oldssn:ofFeature was deprecated in SOSA/SSN. Therefore, it will also be deprecated in BCI-O.-- $ 02:39 AM 2017-08-09 $
true
Status: *STABLE*
Connects a Channel to the Aspect it is described for. This can be read as follows: "A Channel is used to describe a property of an Aspect". This object property is a subproperty of ssn:ofFeature: [ssn-system:SystemCapability] ------ (ssn:ofFeature) ------ [sosa:FeatureOfInterest] [Channel] ---------------------- (ofAspect) -------------------- [Aspect] [SSN] A relation from an ssn-system:SystemCapability to the sosa:FeatureOfInterest the capability is described for. (Used in conjunction with ssn:forProperty).
See general remark about: EEG-CONCEPTS
of aspect
Connecting a Channel to the Aspect it is described for.
AnnotationTag
DataSegment.png
2018-04-18T03:07:00
pointsTo
Status: *STABLE*
Connects a Marker with a DataSegment.
points to
Connecting a Marker individual with a DataSegment.
Context
Context.png
2018-04-15T23:44:00
[Unity]
raises
Status: *STABLE*
Connects a Context.Method with some related Context.Events, as part of a Context.Object interaction that marks their beginning.
From the perspective of the Object-Oriented Programming paradigm, this relationship captures an object message that defines the beginning of a set of events (Context.Events).
raises
Connecting a Context.Method that originates some Context.Events.
Actuation
Actuation.png
2018-05-08T18:43:00
[Seydoux2016]
triggers
Status: *STABLE*
An Actuator triggers an ActuationEvent that causes an effect on the ActuationTarget. Following the Actuation-Actuator-Effect (AAE) design pattern proposed in the IoT ontology IoT-O, this property captures the following definition: [san:Actuator] ------ (triggers) ------ [san:Effect]
See general remark about: 2_MAPPINGS-TO-SAN
%GENERAL_EXAMPLE%@Actuation-Use-Case
triggers
AnnotationTag
Marker_(SOSA-SSN).png
[Associated data type spec]. Its domain of values consists of arbitrary real numbers in the closed range [0..1].
2016-05-22T21:23:00
hasConfidence
Status: *STABLE*
Captures the accuracy (statistical level of confidence) of the ResponseTag.
Example: 0.75
has confidence
Descriptor
Descriptor_(SOSA-SSN).png
[Associated data type spec].
2016-06-29T00:35:00
[OWL-Time]
hasDateTime
Status: *STABLE*
XSD dateTime associated with an entity. BCI applications should measure these time values with at least minute granularity.
has date time
Observations
Descriptor_(SOSA-SSN).png
Record_(SOSA-SSN).png
[Associated data type spec].
2016-07-19T01:59:00
[ESS], [XDF]
hasEndChannel
Status: *STABLE*
The channel number in the recording where the modality block ends.
has end channel
Context,Session,Observations,Actuation
TimeInterval.png
[Associated data type spec].
2018-02-10T03:44:00
[OWL-Time]
hasEndTime
Status: *STABLE*
XSD dateTime associated with an entity that indicates the end point of a time interval. BCI applications should measure these time values with at least second granularity.
For simplicity, this ontology does not explicitly define a TimeInterval concept.
has end (final) date time
Observations
Aspect-and-Modality_(SOSA-SSN).png
[Associated data type spec]. Its domain of values is the positive integers: { 1, 2, 3, ... }
2016-05-24T00:21:00
hasIntensityLevel
Status: *STABLE*
Indicates the level of intensity related to its concept.
Aspect: the measurement of the intensity depends on the nature of the Aspect and purpose of the BCI application.
has intensity level
AnnotationTag,SystemCapabilities
[Associated data type spec].
2018-05-20T23:19:00
[ESS], [XDF]
hasLabel
Status: *STABLE*
A human-readable, descriptive label associated with an instance for general identification purposes.
If necessary, BCI applications may extend the definition of this attribute to specify the preferred notation scheme, using the semantic annotation skos:notation.
has label
(*) In a Channel: this attribute is used for search (access) purposes, according to a preferred labeling scheme (see the editorial note). (*) In a Marker: this attribute indicates a marker type. For example: "110" = "Red light being flashed".
Descriptor,Observations
Descriptor_(SOSA-SSN).png
RecordedData_(SOSA-SSN).png
[Associated data type spec].
2018-05-19T19:42:00
hasLocator
Status: *STABLE*
An IRI locator to access a Web resource.
An http-schemed IRI supports content-type negotiation and can thus convey the media type of the Web resource.
has locator (IRI)
(*) AccessMethod: access to the Web resource that represents the Data File that stores the Record. (*) Descriptor: external Web resource IRI.
Observations
RecordedData_(SOSA-SSN).png
[Associated data type spec].
A broker (or MQTT Server) conforms to the following definition in the latest OASIS MQTT specification: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01, 10 December 2015) -- 1.2 Terminology -- (Server).
As defined in: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01, 10 December 2015) -- 1. Introduction -- (1.5 Data representations).
2016-07-05T04:35:00
hasMQTT.Broker
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Describes the broker (a MQTT Server) parameter in an AccessMethod.MQTT connection.
has MQTT broker
Observations
RecordedData_(SOSA-SSN).png
[Associated data type spec].
A Client Identifier (or ClientId) conforms to the following definition in the latest OASIS MQTT specification: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01, 10 December 2015) -- 3. MQTT Control Packets / 3.1. CONNECT / 3.1.3. Payload -- (3.1.3.1. Client Identifier).
The Client Identifier MUST be a UTF-8 encoded string as defined in Section 1.5.3 of the latest OASIS MQTT specification: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01, 10 December 2015) -- 1. Introduction -- (1.5 Data representations).
2016-07-05T04:35:00
hasMQTT.ID
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Describes the Client Identifier (ClientId) parameter in an AccessMethod.MQTT connection. This parameter identifies the Client to the MQTT Server.
has MQTT ID
Observations
RecordedData_(SOSA-SSN).png
[Associated data type spec].
A Topic conforms to the following definitions in the latest OASIS MQTT specification: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01, 10 December 2015) -- (*) 3. MQTT Control Packets / 3.3 PUBLISH -- Publish message -- (3.3.2.1. Topic Name). (*) 3. MQTT Control Packets / 3.8 SUBSCRIBE -- Subscribe to topics -- (3.8.3 Payload).
The Topic MUST be a UTF-8 encoded string as defined in Section 1.5.3 of the latest OASIS MQTT specification: MQTT Version 3.1.1 Plus Errata 01 (OASIS Standard Incorporating Approved Errata 01 10 December 2015) -- 1. Introduction -- (1.5 Data representations).
2016-07-05T04:35:00
hasMQTT.Topic
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Describes the Topic (name or filter) parameter in an AccessMethod.MQTT connection. This parameter identifies either: (*) The Topic Name (for the PUBLISH message), which identifies the information channel to which payload data is published. (*) The Topic Filter (for the SUBSCRIBE message), which expresses the Client's interest in one or more Topics (each subscription registers a Client's interest with a Server). The payload of a SUBSCRIBE packet contains at least one Topic Filter.
has MQTT topic
The usage of this parameter (e.g., to subscribe to specific Topics) depends on the purpose and implementation of the BCI application.
AnnotationTag
Marker_(SOSA-SSN).png
[Associated data type spec].
2016-05-23T02:03:00
hasModelIRI
Status: *STABLE*
The IRI of the resource that describes or represents the Model or classifier. A Model can be described in any language or format, such as PMML.
has model IRI
Sensors,Observations
Descriptor_(SOSA-SSN).png
Record_(SOSA-SSN).png
[Associated data type spec].
=== EEG 10/20 system channeling schema. ===
2016-10-12T01:57:00
[ESS], [XDF]
hasNumberOfChannels
Status: *STABLE*
Captures the number of channels used in a Record or supported by a Device. Its value is expected to be a positive integer.
[XDF]: channel_count is a non-negative integer that encodes the number of channels in the stream. [ESS 1.0]: number of (used) data channels.
Instead of using this generic datatype property, some BCI applications could define the following specific attributes, according to the recording setup, DeviceChannelingSpec and Record's Modality: (*) Number of used LEDs. (*) Number of used cameras.
has number of channels
Results
DataBlock_(SOSA-SSN).png
[Associated data type spec].
2016-06-12T04:17:00
hasOffset
Status: *STABLE*
Indicates the offset of the DataBlock, in milliseconds.
has offset
Results
DataBlock_(SOSA-SSN).png
[Associated data type spec].
2016-06-12T03:44:00
hasOrdinalPosition
Status: *STABLE*
Indicates the ordinal position of the DataBlock.
has (ordinal) position
Observations
Record_(SOSA-SSN).png
[Associated data type spec].
2016-07-19T02:55:00
[XDF]
hasSampleCount
Status: *STABLE*
The number of samples in the Record.
[XDF] It is defined in the StreamFooter chunk as sample_count.
has sample count
Observations
Record_(SOSA-SSN).png
[Associated data type spec].
2017-08-30T23:58:00
[ESS], [XDF]
hasSamplingRate
Status: *STABLE*
Sampling rate of the recording (Record). Its measurement unit is Hertz (Hz).
Related concept for a Device: SamplingRate.
[ESS 2.0]: the modality may be recorded at a different sampling rate.
(a record) has sampling rate
Observations
Descriptor_(SOSA-SSN).png
Record_(SOSA-SSN).png
[Associated data type spec].
2016-07-19T01:59:00
[ESS], [XDF]
hasStartChannel
Status: *STABLE*
The channel number in the recording where the modality block starts.
has start channel
Context,Session,Observations,Actuation
TimeInterval.png
[Associated data type spec].
2018-02-10T03:44:00
[OWL-Time]
hasStartTime
Status: *STABLE*
XSD dateTime associated with an entity that indicates the starting point of a time interval. BCI applications should measure these time values with at least second granularity.
For simplicity, this ontology does not explicitly define a TimeInterval concept.
has start (initial) date time
AnnotationTag
Marker_(SOSA-SSN).png
[Associated data type spec].
2016-05-22T21:23:00
hasState
Status: *STABLE*
Captures the alphabet symbol of the ResponseTag, representing its "State". BCI domain applications can define their own symbolic scheme to represent their relevant States.
An alphabet-based symbolic scheme could be: "A B C D E...". Thus, a specific State would be: {"B"}
has state
AnnotationTag,Context,Results,SystemCapabilities
Marker_(SOSA-SSN).png
PlayoutInstant.png
[Associated data type spec].
2016-06-29T00:34:00
[OWL-Time]
hasTimeStamp
Status: *STABLE*
XSD dateTimeStamp of a specific time instant. BCI applications are recommended to measure time with high precision, in order to keep proper granularity for this measurement unit. BCI applications should measure time instants with at least millisecond granularity.
has time stamp
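A minimal Turtle sketch of such a timestamp literal with millisecond precision, assuming the bci: prefix resolves to the BCI-O namespace and ex:playoutInstant1 is a hypothetical individual:

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/bci-data#> .   # hypothetical data namespace

# A time instant recorded with millisecond precision, as recommended.
ex:playoutInstant1
    bci:hasTimeStamp "2018-03-14T03:35:00.123Z"^^xsd:dateTimeStamp .
```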
Context,Descriptor,Session
[Associated data type spec].
2016-09-29T03:24:00
hasTitle
Status: *STABLE*
The given title or logical name of an entity. It is used to associate a human-readable label to entities.
has title
File
Stream
Observations
RecordedData_(SOSA-SSN).png
[Associated data type spec].
2016-06-01T01:55:00
hasType
Status: *STABLE*
Indicates the AccessMethod's type: the nature of how the RecordedData can be accessed. This ontology defines only the following two Access Method types: (*) "File": archived access method. (*) "Stream": real-time access method.
has access method type (nature)
1
1
Observations
RecordedData_(SOSA-SSN).png
=== ** definition ** "(a network communication protocol and its parameters)" (*) The identity scheme of the data access. (*) The security scheme of the protocol. ** scopeNote ** This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the network communication protocols that BCI applications can use to access and retrieve the RecordedData. This ontology defines some of the aforementioned network communication protocols, commonly used by BCI applications. ** editorialNote ** This ontology does not define general-purpose datatype properties to capture the following definitions: (*) For the identity scheme of the data access: hasIdentity. (*) For the security scheme of the protocol: hasSecurity and hasQoS. ===
2016-07-06T04:34:00
AccessMethod
Status: *STABLE*
This concept captures any computer network mechanism (network communication protocol) through which the RecordedData can be accessed. An AccessMethod represents any specific standard communication protocol, such as: MQTT, MQTT-SN, HTTP, CoAP, FTP, etc. For the purpose of this ontology, an AccessMethod captures only the following components: (*) The nature of the data access (File or Stream). (*) The locator (address) of the data.
The publish/subscribe mechanism is one of the preferred implementations for BCI applications.
The concept will be merged into the generic abstraction of a container under the oneM2M spec, and made compatible with its emerging semantic extension. A container represents a "data collection" and, therefore, is directly related to the RecordedData concept.
BCI data access method
%APPLICATION%@cerebratek_nupod
File
Observations
RecordedData_(SOSA-SSN).png
2016-07-05T04:35:00
AccessMethod.CoAP
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Represents a CoAP AccessMethod. Similarly to HTTP, this software protocol supports IRIs and content-type negotiation.
CoAP access method
Stream
1
1
1
Observations
RecordedData_(SOSA-SSN).png
2016-07-05T04:35:00
AccessMethod.MQTT
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Represents an MQTT AccessMethod: a machine-to-machine (M2M)/"Internet of Things" connectivity protocol. MQTT was designed as an extremely lightweight publish/subscribe messaging transport. This concept describes the corresponding definition of the access parameters needed for a suitable MQTT connection.
MQTT Access Method
File
Observations
RecordedData_(SOSA-SSN).png
2016-07-05T04:35:00
AccessMethod.RESTful-JSON
--This ontology should be agnostic to the data access-- $ 04:35 AM 2016-07-05 $
true
Status: *STABLE*
Represents a RESTful AccessMethod, where the data is exchanged in JSON format.
RESTful-JSON access method
1
Session,Subject,Context
Activity.png
Subject.png
Context.png
2018-06-05T23:32:00
Action
Status: *STABLE*
Describes a type of Context.Event issued (issues) by a Subject while performing a specific Activity in a Session. Actions are considered structural components of an Activity, done by the Subject while interacting with the Context. As an interaction event, an Action can register many PlayoutInstant.SubjectActions in a Playout.
The concept Action represents a special type of Context.Event whose sole purpose is to identify the set of related events that a Subject (and only a Subject) can issue while performing an Activity during the interaction with the Context.
action
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant Action classification that Subjects can do, while performing an Activity.
1
0
Session,Subject
Activity.png
2018-05-11T23:42:00
Activity
Status: *STABLE*
Activity represents the Subject's physical state while interacting with the Context during a Session. This concept describes an Activity performed by the Subject on a specific Session while interacting with a Context. BCI applications monitor the Subject's physical state during the Sessions. This concept identifies the type of Activity that the Subject is performing while recording the data in a Session. Hence, each Session associates a single Subject interactions with a single Context while performing a single Activity. An Activity can be break down as a set of Actions, performed by the Subject while interacting with the Context.
Relationship between Activity and Aspect: (*) Commonly, BCI applications are designed to analyze how an Activity influences an Aspect: this is part of the research scheme and purpose of a BCI application. However, the BCI-O model allows an Activity to be linked with possibly multiple Aspects through its Session (a session connects to one activity) and its Records (a session has multiple records, and each record has its own aspect). (*) From the perspective of a BCI application, an Activity has a "main" Aspect to analyze; i.e., multiple Records of the same Session connect to the same Aspect. (*) A SPARQL triple pattern that connects Activity(ies) to Aspects would be: ?Session hasActivity ?Activity . ?Session hasRecord ?Record . ?Record bci:aspectOfInterest ?Aspect .
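The Session-to-Aspect path described above can be illustrated with instance data. A minimal Turtle sketch: the ex: individuals are hypothetical, and hasActivity/hasRecord are written as in the triple pattern in the text (their exact IRIs may differ in the released ontology).

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/bci#> .

# One Session connects to one Activity, while each of its
# Records carries its own Aspect.
ex:session1 a bci:Session ;
    bci:hasActivity ex:learningActivity ;
    bci:hasRecord   ex:record1 , ex:record2 .

ex:record1 bci:aspectOfInterest ex:learningAspect .
ex:record2 bci:aspectOfInterest ex:learningAspect .
```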
(*) The concept of Activity is agnostic regarding the number of Subjects engaging in an individual Activity. The ontology clearly defines that the connection between Subjects and Activity(ies) is through Sessions: one Session associates one Subject performing one Activity while interacting with one Context. (*) BCI applications can use this concept as a way to annotate/mark (Marker) the Records (DataSegments).
Some subclasses of this concept could be: (*) Glaucoma Tracking: This Activity type is a common example for "pre-screening" of Subjects. (*) Learning: BCI applications can apply different Stimuli (StimulusEvent) to the Subjects. This Activity type is a common example for "interactive" observations. (*) Sleeping: BCI applications don't apply any kind of Stimuli to the Subjects (there are no StimulusEvent). This Activity type is a common example for "running" observations: continuous observations.
activity
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant Activity classification that Subjects can engage in during Sessions.
1
1
1
Actuation
Actuation.png
2018-05-08T18:19:00
[SSN], [Seydoux2016]
Actuation
Status: *STABLE*
[SSN] Carries out an (actuation) procedure to change the state of the world using an Actuator. The relationships between Actuation and other concepts are the ones defined in [SSN]. For this ontology, the following qualified exact-cardinality restrictions on sosa:Actuation properties are important for the definition of Actuation: (*) sosa:madeByActuator EXACTLY 1 (sosa:Actuator): restricts the association to exactly 1 Actuator. (*) sosa:hasFeatureOfInterest EXACTLY 1 (sosa:FeatureOfInterest): restricts the association to exactly 1 ActuationTarget.
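The two exact-cardinality restrictions can be sketched in Turtle/OWL. This is a sketch using standard OWL qualified-cardinality syntax; the axioms in the released ontology may be shaped differently.

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix bci:  <https://w3id.org/BCI-ontology#> .

# Actuation: exactly 1 Actuator and exactly 1 FeatureOfInterest.
bci:Actuation a owl:Class ;
    rdfs:subClassOf sosa:Actuation ,
        [ a owl:Restriction ;
          owl:onProperty sosa:madeByActuator ;
          owl:onClass sosa:Actuator ;
          owl:qualifiedCardinality "1"^^xsd:nonNegativeInteger ] ,
        [ a owl:Restriction ;
          owl:onProperty sosa:hasFeatureOfInterest ;
          owl:onClass sosa:FeatureOfInterest ;
          owl:qualifiedCardinality "1"^^xsd:nonNegativeInteger ] .
```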
See general remark about: 2_MAPPINGS-TO-SAN
See general remark about: PROCEDURES
%GENERAL_EXAMPLE%@Actuation-Use-Case
actuation
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant Actuation classification that Subjects can engage in during Sessions.
1
Actuation,Context
Actuation.png
2018-05-08T18:53:00
[Seydoux2016]
ActuationEvent
Status: *STABLE*
Represents a transition (something that has changed from one state to a different one: the ActuationTarget), i.e., a modification (ImpactedProperty), in the Context as the result of an actuation (ActuationResult involves ActuationEvent). From the Context perspective, this concept is a Context.Event (triggered by an Actuator) that changes the ImpactedProperty of the ActuationTarget. [Seydoux2016] Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this concept is taken from the following relationships involving the san:Effect definition: (*) san:Actuator (triggers) san:Effect. (*) san:Actuation (involves) san:Effect. (*) san:Effect (impacts) ImpactedProperty.
See general remark about: 2_MAPPINGS-TO-SAN
san:Effect is defined in [SAN] as: "Concept bound to the definition of an actuator as an agent having an effect on the physical world. Therefore, an effect is any kind of physical modification induced by an actuator." In order to be more semantically precise, and based on the SOSA/SSN definitions aligned with Dolce-Ultralite (DUL) (see the Semantic Sensor Network Ontology, W3C Recommendation of 19 October 2017, section 6.1 "Dolce-Ultralite Alignment Module"), the concept san:Effect is described distinctively by the following combined ontological notions: (*) A happening that impacts a quality (DUL:Quality), or property (ssn:Property), with the capability of an Actuation to act on it (sosa:actsOnProperty); that is, a type of sosa:ActuatableProperty: the ImpactedProperty. (*) An event (DUL:Event) induced by (triggers) an Actuator that modifies (changes) the physical world (ActuationTarget): a type of Context.Event; the ActuationEvent. (*) (An effect is seen as...) Any kind (of an ImpactedProperty) of physical modification (ActuationEvent changes ActuationTarget) as the result of an actuation (ActuationResult involves ActuationEvent).
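The AAE relationships quoted above, read in BCI-O terms, can be sketched as instance data. The ex: individuals are hypothetical, and the property names (triggers, involves, changes, impacts) are assumed from the relationships in the text, not confirmed IRIs.

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/bci#> .

# An Actuator triggers an ActuationEvent; the ActuationResult
# involves it; the event changes the ActuationTarget and
# impacts the ImpactedProperty.
ex:windowActuator    a bci:Actuator ;
    bci:triggers     ex:windowOpenedEvent .
ex:openWindowResult  a bci:ActuationResult ;
    bci:involves     ex:windowOpenedEvent .
ex:windowOpenedEvent a bci:ActuationEvent ;
    bci:changes      ex:window ;          # the ActuationTarget
    bci:impacts      ex:windowOpenState . # the ImpactedProperty
```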
If needed, BCI applications can time stamp an ActuationEvent.
%GENERAL_EXAMPLE%@Actuation-Use-Case
actuation event
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant ActuationEvent classification.
1
Actuation,Results
Actuation.png
2018-05-08T18:53:00
[SSN], [Seydoux2016]
ActuationResult
Status: *STABLE*
[SSN] Represents the result of an Actuation, i.e., an entity representing the "effect" of the Actuation, which involves an ActuationEvent. [Seydoux2016] Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this concept expands the following relationship: Actuation (involves) Effect.
See general remark about: 2_MAPPINGS-TO-SAN
If needed, BCI applications can time stamp an ActuationResult.
%GENERAL_EXAMPLE%@Actuation-Use-Case
actuation result
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant ActuationResult classification.
Actuation,Context
Actuation.png
2018-03-23T04:35:00
[SSN], [Seydoux2016]
ActuationTarget
Status: *STABLE*
The following concepts encompass its modeling depiction: (*) [SSN] A sosa:FeatureOfInterest: the thing (ActuationTarget) whose property (ImpactedProperty) is being manipulated by an Actuator. (*) [SSN] Related from a sosa:Actuation via the property sosa:hasFeatureOfInterest: a relation between an Actuation and the entity (ActuationTarget) whose property (ImpactedProperty) was modified. (*) A Context.Object: a thing (object) in the interaction Context of the Session. [Seydoux2016] Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this concept captures the definition of FeatureOfInterest from the following relation: ImpactedProperty (is property of) FeatureOfInterest.
%GENERAL_EXAMPLE%@Actuation-Use-Case
[SSN] A window is an ActuationTarget (sosa:FeatureOfInterest) for an automatic window control Actuator.
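The window example can be written as SOSA instance data. The ex: individuals are hypothetical; the sosa: properties are standard SOSA terms.

```turtle
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix ex:   <http://example.org/bci#> .

# An automatic window control actuation on the window.
ex:closeWindow a sosa:Actuation ;
    sosa:madeByActuator       ex:windowController ;  # the Actuator
    sosa:hasFeatureOfInterest ex:window ;            # the ActuationTarget
    sosa:actsOnProperty       ex:windowOpenState .   # the ImpactedProperty
```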
actuation target
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant ActuationTarget classification that Subjects interact with (changing its state) in Sessions.
1
1
1
Actuation
Actuation.png
2018-05-08T18:39:14
[SSN], [Seydoux2016]
Actuator
Status: *STABLE*
[SSN] A device that is used by, or implements, an (actuation) procedure that changes (triggers) the state of the world (ActuationTarget).
See general remark about: 2_MAPPINGS-TO-SAN
(*) "An actuator is a component of a machine that is responsible for moving or controlling a mechanism or system.". (*) "An actuator requires a control signal and a source of energy: [ControlSignal]". (*) "An actuator is the mechanism by which a control system *acts* upon an environment.". Reference: [Wikipedia: Actuator]
[Seydoux2016] Actuators are devices that transform an input signal into a physical output, making them the exact opposite of sensors. SAN (ontology) is built around Actuation-Actuator-Effect (AAE), a design pattern inspired by the Stimulus-Sensor-Observation (SSO) pattern. Based on the focus of this ontology, and because actuators are the exact opposite of sensors, channel concepts for Actuators are not defined. If needed, BCI applications can model the system capabilities of an actuator by directly following SOSA's structure and concepts.
%GENERAL_EXAMPLE%@Actuation-Use-Case
actuator
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant Actuator classification that Subjects can use in Sessions.
0
Descriptor,Actuation
Actuation.png
Descriptor_(SOSA-SSN).png
2017-12-11T03:41:00
ActuatorSpec
Status: *STABLE*
An ActuatorSpec is an information object that describes specific properties (such as: hardware specs, power used, types of interfaces, etc.) of an Actuator. Similarly to DeviceSpec, the structure of ActuatorSpec has been modeled as a composite object so that it can be composed as a set of ActuatorSpecs to describe specific parts of an Actuator. In this way, an ActuatorSpec is considered as a bag of descriptive properties about the Actuator. An ActuatorSpec is a specialized Descriptor. An ActuatorSpec can be used to record any descriptive information related to the physical actuator component.
This ontology does not define any particular information object of an ActuatorSpec.
actuator specification
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe specific types of ActuatorSpec.
1
Observations
Aspect-and-Modality_(SOSA-SSN).png
2017-08-20T19:50:00
[SSN], [Compton2009]
Aspect
Status: *STABLE*
[SSN] It is the classification of the sosa:FeatureOfInterest in the course of a sosa:Observation for BCI Activities. This concept captures the view or interpretation of the Records, and thus it defines the purpose and/or scope of the observations of BCI Activities.
See general remark about: ASPECT-and-MODALITY
The following descriptions capture the definition of this concept ([oldSSN: FeatureOfInterest] and [Compton2009]) adjusted to this ontology: (*) An Aspect is an abstraction of BCI activities performed by humans, from the perspective of the human body's state. (*) Devices observe physiological signals (Modality-ies) of Aspects: for example, the EEG signals (Modality) of an emotion (Aspect). (*) Aspects are states of the human body that are the target of sensing.
aspect
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the Aspects of the Records. Three main Aspects are defined in this ontology.
0
1
SystemCapabilities
MeasurementCapability_(SOSA-SSN).png
2017-08-30T19:51:00
[SSN], [XDF], [ESS]
Channel
Status: *STABLE*
A Channel is a relevant metadata set that defines a logical component schema of a DeviceChannelingSpec's data structure model. A Channel is defined as a specialized ssn-system:SystemCapability type that describes a compounded set of measurement properties (ssn-system:SystemProperty-ies), as explained in the [SSN] System Capabilities Module and the [oldSSN] MeasuringCapability Module. A Channel is associated with a DeviceChannelingSpec definition and, hence, with a RecordChannelingSpec. As part of a channeling spec, a Channel definition can be extended to incorporate contextual metadata semantics (i.e., properties to describe dimensional characteristics regarding what, when, how --including mathematical formulas for calculations--, where, why, etc.), depending on the DataFormat used for the data files. This information can be associated with a channel definition via a Descriptor set.
See general remark about: UNITS-OF-MEASUREMENT
BCI systems naturally collect and transmit data from a sender (transmitting machine) to a receiver (receiving machine). In a general and abstract way, a channel is used to convey an information signal, for example a digital bit stream, from one or several senders (or transmitters) to one or several receivers. The concept of channel defined in this ontology aims to capture a relevant metadata set that describes the measurement properties for any type of channel.
Channels are the logical components of Records. Their structural relationship resembles a matrix: Channels are the rows (each Channel is a different data row), while data samples are the columns (each column corresponds to a specific time instant). Hence, Channels can point to specific parts of a Record.
Channel Structure: The channel structure is composed of different related metadata and attributes. It varies widely depending on the following: (*) the related/associated Modality; (*) the functional "role" that it plays in a Device's data model; and (*) the way it is used in a specific Record's settings. Because a channeling spec is directly associated with a Device and a Record, in theory a channel defines a specific (or proper) logical data structure component for a Device and/or a Record. The most common metadata and attributes of a channel structure, regardless of its nature, are: (*) Label: defined as hasLabel. (*) Type: defined as the channel class type (class hierarchy). (*) Placement (or Location): refers to the attribute set that defines its placement on a Subject. The placement's attribute structure varies widely depending on the channel's nature.
A simplistic notion of the relationship and difference between the concepts of Channel and DataFormat is depicted in the following example: If the DataFormat used were "CSV", then the channeling schema (DeviceChannelingSpec) would define the data's logical structure: [Channeling Schema] = { Col1: ID, Col2: Date, Col3: Name, ... }, where each column represents a specific Channel definition. Note that each column has its own related metadata and attributes; it also follows its own structure, format or notation scheme, etc.
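The CSV example above can be sketched as a channeling schema in Turtle. The ex: individuals and the ex:hasChannel property are hypothetical illustrations; hasLabel is the label attribute named in the text.

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/bci#> .

# Each CSV column is described by its own Channel definition.
ex:csvChannelingSpec a bci:DeviceChannelingSpec ;
    ex:hasChannel ex:col1 , ex:col2 , ex:col3 .

ex:col1 a bci:Channel ; bci:hasLabel "ID" .
ex:col2 a bci:Channel ; bci:hasLabel "Date" .
ex:col3 a bci:Channel ; bci:hasLabel "Name" .
```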
(*) [XDF] Eye-Gaze Channel: channeling metadata for an Eye-Gaze Record. (*) CoordinateSystem = { World-Space, Object-Space, Camera-Space, or Image-Space }: coordinate system of the respective parameter. (*) RefersTo = { Left, Right, or Both }: which eye the channel is referring to. (*) Type: Type of data in this channel. It can be any of the following values: (*) { ScreenX, ScreenY }: screen coordinates of the gaze cursor (can also refer to a scene image); usually in pixels. (*) { DirectionX, DirectionY, DirectionZ }: 3D gaze vector in some coordinate system. (*) { PositionX, PositionY, PositionZ }: 3D position of the eye center in some coordinate system. (*) { IntersectionX, IntersectionY, IntersectionZ }: 2D or 3D position of the intersection point with a plane (in some coordinate system). (*) { HeadX, HeadY, HeadZ }: 3D location of the head center in some coordinate system. (*) { PupilX, PupilY, PupilZ }: 2D or 3D location of the pupil center in some coordinate system. (*) { ReflexX, ReflexY, ReflexZ }: 2D or 3D location of the illuminator's reflection point in some coordinate system. (*) { Radius or Diameter }: the overall pupil radius or diameter (usually in mm or pixels). (*) { RadiusX, RadiusY }: horizontal and vertical pupil radius. (*) { DiameterX, DiameterY }: horizontal and vertical pupil diameter. (*) { Confidence }: for confidence information (preferred unit: normalized). (*) { FrameNumber }: frame number that the parameters were calculated from. (*) { PlaneNumber or ObjectId }: number or identifier of the object that was intersected by the gaze vector. (*) Keyboard-Hit Channel: channeling metadata for a Keyboard-Hit Record. (*) [XDF] Hand-Gesture Channel: channeling metadata for a Hand-Gesture Record. (*) Type = { Confidence, OrientationH, OrientationP, OrientationR, PositionX, PositionY, PositionZ }: type of data. [following types from GazeMetaData "LeapMotion_xml_output" definitions.] 
(*) Mouse-Click Channel: channeling metadata for a Mouse-Click Record. (*) Type = { PositionX, PositionY }. (*) Button = { Left, Other, Right, Wheel }.
channeling data (logical component)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe specialized channel definitions related to specific Modality types.
Descriptor,Observations
Aspect-and-Modality_(SOSA-SSN).png
Descriptor_(SOSA-SSN).png
2016-08-08T23:54:00
ChannelingSpec
Status: *STABLE*
Each Modality defines its own specific channeling schema information: a complete, generic and descriptive set of all possible Channels and their extended metadata attributes that defines the data structure model and template of the Modality. A ChannelingSpec captures the complete description of the channeling schema information, in the form of an external document specification (outside the metadata repository). Similar to the DeviceSpec concept, a ChannelingSpec is a specialized Descriptor.
The channeling schema information related to all kinds of EegRecords would define around 32 fields (i.e., EegChannels) to describe a complete data structure for EEG data. The full specification for this channeling schema information would be associated with the generic EegModality concept. A proper name for this spec would be EegChannelingSpec.
channeling schema spec
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of a ChannelingSpec to capture the external information that defines the complete channeling schema information of a Modality.
Observations
Aspect-and-Modality_(SOSA-SSN).png
2016-05-24T00:53:00
CognitiveAspect
Status: *STABLE*
Describes the classification of CognitiveAspects. One application for this Aspect is Learning.
cognitive aspect
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the CognitiveAspects of the Records.
0
1
1
Session
Session.png
2016-06-28T04:57:00
[ESS]
Collection
Status: *STABLE*
Groups a collection of related Sessions and/or Interactions, which may be associated with different Activity(ies).
(*) Collection generalizes the concept of Study as defined in [ESS]: a set of data collection efforts to answer one or few related scientific questions. (*) This concept defines a longitudinal (temporal) collection of Sessions.
collection
1
Actuation
Actuation.png
2018-05-11T18:45:00
[Seydoux2016]
Command
Status: *STABLE*
Represents a specific order (based on a Record) to an Actuator to perform an Actuation. Typically, it depicts an instruction (or signal) that causes an Actuator to perform (executes) one of its basic functions, thus triggering an Actuation. A Command has the following intrinsic characteristics: (*) Defines the input for a set of Actuators. (*) Its source is a set of Records. [Seydoux2016] From the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this concept is based on the following definitions: (*) Actuator (consumes) Input. (*) Actuator (executes) Command.
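The Command characteristics listed above can be sketched as instance data. This is a hypothetical sketch: the ex: individuals and the ex:hasSource, ex:executes and ex:triggeredBy property names are illustrative, assumed from the relationships in the text.

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/bci#> .

# A Command sourced from a Record, executed by an Actuator,
# triggering an Actuation.
ex:openWindowCommand a bci:Command ;
    ex:hasSource ex:eegRecord42 .       # its source is a Record

ex:windowActuator a bci:Actuator ;
    ex:executes ex:openWindowCommand .  # Actuator executes Command

ex:openWindow a bci:Actuation ;
    ex:triggeredBy ex:openWindowCommand .
```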
See general remark about: 2_MAPPINGS-TO-SAN
%GENERAL_EXAMPLE%@Actuation-Use-Case
command
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe specific types of Commands.
0
1
1
1
1
Context
Context.png
2016-04-17T21:45:00
[Shafer2001], [Unity]
Context
Status: *STABLE*
In [Shafer2001] we find the following definition of context: Dey et al. define context as "any information that can be used to characterize the situation of entities (i.e., whether a person, place, or object) that are considered relevant to the interaction between a user and an application" (p. 106). Thus, context awareness implies two attributes of a system: the ability to obtain context and the ability to utilize contextual information. For the purpose of this ontology, a Context is the architectural description of the environment (external settings, components and procedures) with which a Subject interacts during a Session.
A classification (class hierarchy) for Context has not yet been defined.
(*) Physical descriptions of any environment. (*) Simulations. (*) Video Games. (*) Virtual Reality environments.
context
1
Context,Subject
Context.png
2018-04-12T23:47:00
Context.AutonomousBeing
Status: *STABLE*
Any self-contained and self-governed Context.Object able to react to Context.Events (stimuli) and act on its own, based on its specific Context.Capability-ies. This concept encompasses any living organism, such as humans and animals.
From the Context perspective, the architectural description of Context.AutonomousBeings, along with their Context.Capability-ies, gives a framework for the following modeling premises: (*) A Subject is a special kind of Context.AutonomousBeing. (*) From a Subject perspective, the notion of "act on its own" implies that a Subject can issue (Context.Capability) multiple Actions. (*) A Subject interacts with a Context through her Actions.
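The premises above can be sketched as instance data: a Subject (a special Context.AutonomousBeing) issuing an Action via a capability. The ex: individuals are hypothetical; canPerform and issues are the property names given in the text for Context.Capability and Action.

```turtle
@prefix bci: <https://w3id.org/BCI-ontology#> .
@prefix ex:  <http://example.org/bci#> .

# A Subject with a capability, issuing an Action.
ex:subject1 a bci:Subject ;
    bci:canPerform ex:clickCapability ;  # a Context.Capability
    bci:issues     ex:mouseClick1 .

ex:mouseClick1 a bci:Action .
```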
(*) Human beings (*) Animals (*) "Intelligent" machines/devices/artifacts (with well-defined Context.Capability-ies)
context autonomous being
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any Context.AutonomousBeing entities.
Context,Subject
Context.png
2018-04-15T23:47:00
Context.Capability
Status: *STABLE*
A capability is the ability of a Context.AutonomousBeing to perform (canPerform) or achieve certain actions (see: Action) or outcomes.
The architectural description of Context.Capability gives a framework for the following modeling premises: (*) A Context.Capability may raise (raises) Context.Events. (*) Through her Context.Capability-ies, a Subject can issue (issues) Actions.
capability
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any Context.Capability entities.
1
1
Context
Activity.png
Context.png
2018-04-19T00:47:00
[Unity]
Context.Event
Status: *STABLE*
Captures a change of state on a set of related Context.Objects through the effectuation of their behavior (Context.Methods) in a time frame. Context.Events model any kind of "happening" or "occurrence" among a set of Context.Objects in the timeline of a Context.Scene (temporality).
Regarding the alignment to DUL:Event: Below are two groups of important notes and modeling considerations about its ontological implications: (*) Regarding its "ontological definition": (*) It defines a relationship to a time interval; hence, it has an intrinsically temporal nature. (*) An event does not represent a capability of any object but a temporal object participation. (*) Architecturally speaking, it is natural to model sequences of related Context.Events: DUL:precedes/DUL:follows relations. (*) This ontology followed a modeling approach of participant-based classification of events. (*) Regarding "Causality": DUL:Event's definition refers to the "causality" nature of an event in two alternative sets of views: (*) Aspectual views: "as a transition (something that has changed from a state to a different one)". (*) Intentionality views: "the causal analysis generates situations with different identities, according to what DUL:Description is taken for interpreting the DUL:Event". (...) "If intentionality is a criterion to classify events or not, this depends on if an ontology designer wants to consider causality as a relevant dimension for events' identity". The architectural description of events in a Context in this ontology fits the "Aspectual" view of the nature of causality: as a transition between states based on the Context.Methods of the Context.Objects.
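A sequence of related Context.Events ordered with the DUL:precedes/DUL:follows relations mentioned above can be sketched as follows; the ex: event individuals are hypothetical (named after common pointer events).

```turtle
@prefix dul: <http://www.ontologydesignpatterns.org/ont/dul/DUL.owl#> .
@prefix ex:  <http://example.org/bci#> .

# A minimal temporal sequence of Context.Events.
ex:pointerDown dul:precedes ex:pointerUp .
ex:pointerUp   dul:precedes ex:pointerClick ;
               dul:follows  ex:pointerDown .
```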
Some event types from the Gaming domain are: (*) PointerEnter (*) PointerExit (*) PointerDown (*) PointerUp (*) PointerClick (*) UpdateSelected (*) InitializePotentialDrag (*) BeginDrag (*) EndDrag
context event
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any Context.Event entities. This class hierarchy includes the subclasses StimulusEvent, ActuationEvent, and Action.
1
0
Context
Context.png
2018-04-15T23:47:00
[Unity]
Context.Method
Status: *STABLE*
A function, workflow, protocol, plan, algorithm, or computational method specifying how (a set of formal rules) to perform an operation (usually represented as a verb-phrase) associated with a set of Context.Objects that defines a perspective of their expected behavior, which may change the state of related Context.Objects and may raise Context.Events. A Context.Method is reusable and might be involved in many Context.Events. A Context.Method defines the steps to be carried out as part of an expected behavior associated with a set of Context.Objects.
The architectural description of methods in a Context in this ontology is based on the following modeling premises: (*) Object-Oriented Programming: This concept resembles the notion of method (of a class). (*) Unity domain: This concept is equivalent to the Script component. (*) BCI domain: This concept captures the primitives for a "Protocol" or "Procedure" in classical BCI experiments, that is, the behavior (methods) of the target Context.Objects that the Subject needs to pay attention to during a Session while performing an Activity.
From the Unity Gaming Modeling Architecture, the following components could be defined as methods: (*) Physics: A Context.Method that can define specific behavior based on Physics models. (*) Transform: A Context.Method that defines the logic of how a Context.Object can move. (*) Protocol or Procedure: some BCI applications may need to define a set of Context.Objects that the Subject needs to pay attention to. A specific type of Transform could be defined to represent the logical behavior or movement of a Context.Object based on an algorithm.
Some method types from the Gaming domain are: (*) Drag (*) Drop (*) Scroll (*) Select (*) Deselect (*) Move (*) Jump (*) Run (*) Roll-Over (*) Hit (*) Throw (*) Beep (*) On (*) Off (*) Open (*) Close (*) Submit (*) Cancel
context method
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any Context.Method entities. This class hierarchy includes the subclass Context.Capability.
0
1
0
1
1
1
Context
Context.png
2018-04-15T23:10:00
[Unity]
Context.Object
Status: *STABLE*
Captures the architectural description of a DUL:Object (any spatially located DUL:Entity with occurrences --temporality-- in some DUL:Events) that participates interactively in a Context.Scene with a specific Context.Role. A Context.Object has the following characteristics: (*) Spatial Location: Its scope is bound to a Context.Role for a specific Context.Scene. (*) Structural Composition: It is composed of a collection of Context.Objects. Hence, the structure of a Context.Object is a set of composite Context.Objects. (*) Functional Behavior: It is defined throughout a set of Context.Methods that capture the range of suited operations that can be performed to serve its Context.Role. This is the reason behind the interaction among Context.Objects. (*) Timeline (Period of Existence): It is bound to its participation in Context.Events for a specific Context.Scene.
(*) This concept resembles the notion of object in the Object-Oriented Programming paradigm: Context.Objects interact with one another (raises Context.Events) through their Context.Methods. (*) A special characterization of Context.Objects through the concept Context.AutonomousBeing, sets a proper architectural framework to model Subjects and Actions from the Context perspective.
Taken directly from the Unity Gaming Modeling Architecture, below are listed some components that may be defined as Context.Object types: (*) Audio: object with audio capabilities. (*) Camera: object with specific visual capabilities for perspectives. (*) Effects: object that can define specific visual effects. (*) Layout: object that can define specific layout configurations. (*) Video: object with video recording capabilities. From its Modeling Architecture, the description of "capabilities" in Unity gives a solid reference point to the notions of Context.Method and Context.Capability.
context object
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the contextual interactive entities that participate in any Context.Scene. This class hierarchy includes the subclasses Context.AutonomousBeing and ActuationTarget.
Describing contextual interactive objects in any Context.Scene.
0
1
Context
Context.ObjectComponent.png
2018-03-14T03:35:00
[Unity]
Context.ObjectComponent
--Following closely the alignment to DUL, the concepts about Objects and Events are distinctly separated. Therefore, from a structural perspective, a Context.ObjectComponent is a Context.Object except for Context.ObjectComponent.Event which is changed to Context.Event.--$ 02:57 AM 2018-03-14 $
true
Status: *STABLE*
Captures the architectural description of a stand-alone entity ("logical" or "physical") that structurally forms part of a Context.Object. A Context.ObjectComponent can be composed of Context.ObjectComponents.
(*) Audio: A Context.ObjectComponent with audio capabilities. (*) Camera: A Context.ObjectComponent that defines a specific visual perspective for the Subject. (*) Effects: A Context.ObjectComponent that can define specific visual effects. (*) Layout: A Context.ObjectComponent that can define specific layout configurations. (*) Physics: A Context.ObjectComponent that can define specific behavior based on Physics models. (*) Transform: A Context.ObjectComponent that defines the logic of how an entity can move. (*) Protocol or Procedure: some BCI applications may need to define a set of Context.Objects that the Subject needs to pay attention to. A specific type of Transform could be defined to represent the logical behavior or movement of an entity based on an algorithm. (*) Video: A Context.ObjectComponent with video recording capabilities.
context object component
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the entities that can form any Context.Object.
Context
Context.Role.png
2018-04-12T04:50:00
[Unity]
Context.Role
Status: *STABLE*
Specifies the purpose, mission or functional classification (a DUL:Role) of a Context.Object in a Context.Scene, defining the notions of its expected nature from the following perspectives: (*) Structure: What is it? (*) Behavior: How does it interact? A Context.Object has exactly one Context.Role.
This ontology does not define any specific Context.Role subclasses.
From the Unity Gaming Modeling Architecture, the following are common roles found in any scene: (*) Character: The Context.Object participates as a character in the Context.Scene. That is, as an autonomous animated object (Context.AutonomousBeing) that interacts directly (in the foreground) with the Subject. For some BCI applications, this role describes all the Context.Objects that form a "Protocol" or "Procedure"; that is, the ones that the Subject needs to pay attention to (the "target" objects). (*) Property: The Context.Object participates as a property in the Context.Scene. That is, as a co-dependent object that can influence the configuration of any object in a Context.Scene. (*) Scenery: The Context.Object participates as part of the scenery in the Context.Scene. That is, as an (autonomous animated) object (may be a Context.AutonomousBeing) that interacts indirectly (in the background) with the Subject. Some BCI applications implement simple Contexts, based solely on two structural roles for Context.Objects: (*) Background. (*) Foreground: where the "Protocol" or "Procedure" is implemented. Hence, the classification described above can be mapped to this terminology in the following way: (*) Background is equivalent to the Scenery role. (*) Foreground is equivalent to the Character role.
context object role
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the different ways ("Roles") in which Context.Objects can participate in a Context.Scene.
The classification of how a Context.Object participates in a Context.Scene.
0
1
0
0
1
1
Context
Context.Scene.png
2018-04-16T22:24:00
[Unity]
Context.Scene
Status: *STABLE*
An ordered temporal part of a Context that captures possible relevant contextual interactions, i.e., a collection of Context.Objects interplaying with one another (a sequence of Context.Events) in a specific way. A Context is composed of a non-empty sequence of Context.Scenes, based on the time dimension (temporality): a Context.Scene is related to other Context.Scenes based on its temporality (occurrence in its sequence). A Context.Scene can be composed of multiple Context.Scenes. The architectural description of a Context.Scene entity is depicted in the following way: (*) Structural: its associated Context.Objects (and their compositions). (*) Functional: the associated Context.Methods (and related Context.Events) via the Context.Objects that comprise its structure. (*) Temporal: the collection of all sequences of included Context.Events. A Context.Scene corresponds to the notion of a World or Level on a Gaming platform.
(*) "In mathematics, a sequence is an ordered collection of objects in which repetitions are allowed. (...) Unlike a set, order matters, and exactly the same elements can appear multiple times at different positions in the sequence". Reference: [Wikipedia: Sequence]
(*) On a Simulation: [driving a car in a not-busy day on a freeway]; [driving a car in rush hour on a main city road]. (*) On a Video Game: World 3; Level 3-2.
context scene
1
1
1
1
1
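The temporal composition described above (a Context composed of an ordered sequence of Context.Scenes, where a Context.Scene may itself be composed of Context.Scenes) can be sketched with plain data structures. This is an illustrative sketch only; the class and function names are invented and are not part of the ontology.

```python
from dataclasses import dataclass, field

@dataclass
class ContextScene:
    """Illustrative Context.Scene: a name plus an ordered sequence of
    sub-scenes, related to each other by occurrence in the sequence."""
    name: str
    subscenes: list = field(default_factory=list)

def temporal_order(scene):
    """Flatten a scene composition in temporal (depth-first) order."""
    names = [scene.name]
    for sub in scene.subscenes:
        names.extend(temporal_order(sub))
    return names

# Mirrors the video-game example: World 3 composed of Levels 3-1 and 3-2.
world3 = ContextScene("World 3", [
    ContextScene("Level 3-1"),
    ContextScene("Level 3-2"),
])
print(temporal_order(world3))  # ['World 3', 'Level 3-1', 'Level 3-2']
```

The recursion reflects that a Context.Scene can contain Context.Scenes, while list order carries the temporality of the sequence.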
Results
DataBlock_(SOSA-SSN).png
=== ** previous definition of isValueOf ** rdfs:subClassOf owl:Restriction ===
2018-01-17T05:25:00
[oldSSN], [SSN]
DataBlock
Status: *STABLE*
A DataBlock represents the basic/atomic physical data unit value of a RecordedData. Hence, following the previous definition in [oldSSN], a RecordedData has as its value a non-empty sequence (with no repetitions allowed) of DataBlocks. In the BCI domain: (*) A RecordedData is composed of a non-empty sequence (with no repetitions allowed) of DataBlocks. (*) A DataBlock is considered a physical data entity (whereas a DataSegment is considered a logical data entity). (*) A DataBlock is related to another DataBlock based on its temporality: occurrence in its sequence. Therefore, a DataBlock may be sequentially linked to a following and a previous DataBlock. (*) All the sequentially linked DataBlocks compose the value of a RecordedData. (*) The mechanism to access a DataBlock is the following: (*) First, one retrieves the locators of the AccessMethods associated with the desired RecordedData. With this, one gains access to the "data file". (*) Then, one can derive the corresponding DataBlock locators, using the positional attributes: ordinal position, offset or timestamp.
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
"In mathematics, a sequence is an ordered collection of objects in which repetitions are allowed. (...) Unlike a set, order matters, and exactly the same elements can appear multiple times at different positions in the sequence". Reference: [Wikipedia: Sequence] For this specific concept, a sequence of DataBlocks does not allow repetitions.
Depending on its implementation nature, a BCI application may choose to use any (or both) of the positional attributes: (*) hasTimeStamp. (*) hasOffset.
BCI data block
0
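The DataBlock access mechanism described above (resolve the "data file" via the AccessMethod locators, then derive each DataBlock locator from a positional attribute: ordinal position, offset, or timestamp) can be sketched as follows. All names, and the assumption of fixed-size blocks at a constant sampling rate, are illustrative and not part of the ontology.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataBlockLocator:
    """Hypothetical locator for one DataBlock inside a RecordedData."""
    data_file: str      # locator obtained from the AccessMethod
    ordinal: int        # position of the block in the sequence
    offset: int         # byte offset inside the data file
    timestamp: float    # seconds since the start of the recording

def derive_block_locators(data_file, n_blocks, block_size, sampling_rate):
    """Derive sequential DataBlock locators for one RecordedData,
    assuming fixed-size blocks and one block per sample period."""
    return [
        DataBlockLocator(
            data_file=data_file,
            ordinal=i,
            offset=i * block_size,
            timestamp=i / sampling_rate,
        )
        for i in range(n_blocks)
    ]

blocks = derive_block_locators("session-042.xdf", n_blocks=4,
                               block_size=256, sampling_rate=128.0)
print(blocks[2].offset, blocks[2].timestamp)  # 512 0.015625
```

A BCI application would pick whichever positional attribute (hasOffset or hasTimeStamp) suits its storage layout, as the usage note above indicates.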
Observations
RecordedData_(SOSA-SSN).png
2016-08-17T23:58:00
DataFormat
Status: *STABLE*
This concept describes any Data Format that BCI applications use to represent and store the RecordedData. A DataFormat represents any specific standard data format used in one of the following ways: (*) Signal (electrical engineering). (*) File format. (*) Content format.
A DataFormat should properly define its encoding scheme.
BCI data format
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the data formats that BCI applications can use to represent and store the RecordedData. This ontology defines some common data formats used in the BCI domain.
1
0
0
AnnotationTag
DataSegment.png
2016-06-12T22:05:00
DataSegment
Status: *STABLE*
It is a set of sequentially linked DataBlocks and thus identifies a proper subset of a RecordedData. A time interval is implicitly found between the first DataBlock (startTime) and the last DataBlock (endTime) of the DataSegment. In the BCI domain: (*) A DataSegment constitutes the basic data unit for "tagging" purposes, i.e., to associate semantic annotations (event tags or Markers) with the data (DataBlock sets). (*) A DataSegment is a collection of DataBlocks that spans a certain time interval. (*) A DataSegment is considered a logical data entity (whereas a DataBlock is considered a physical data entity).
Right after a set of DataSegments is created, two consecutive tasks occur: (*) A set of editing processes is run on the recordings, which classify the data sets based on different Models. (*) A set of ResponseTags is created, which capture the information related to "what is so special about" a particular DataSegment.
data segment
0
1
1
1
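The implicit time interval of a DataSegment (startTime from its first DataBlock, endTime from its last) can be sketched as follows. The names and the 128 Hz figure are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataBlock:
    """Illustrative physical data unit."""
    ordinal: int
    timestamp: float   # seconds since the start of the recording

@dataclass(frozen=True)
class DataSegment:
    """Illustrative logical data entity: consecutive DataBlocks
    forming a proper subset of a RecordedData."""
    blocks: tuple

    @property
    def start_time(self):
        return self.blocks[0].timestamp

    @property
    def end_time(self):
        return self.blocks[-1].timestamp

# A fake 4-second recording at 128 blocks/second...
raw = tuple(DataBlock(i, i / 128.0) for i in range(512))
# ...and a tagged segment covering blocks 64..191.
segment = DataSegment(raw[64:192])
print(segment.start_time, segment.end_time)  # 0.5 1.4921875
```

A Marker or ResponseTag would then point at such a segment rather than at individual DataBlocks.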
Descriptor
Descriptor_(SOSA-SSN).png
2016-06-30T01:30:00
[ESS], [XDF]
Descriptor
Status: *STABLE*
Describes an external Web resource that complements the information related to a specific entity found in this ontology. In a general sense, it represents a class of information objects that describe metadata. Each individual of this class refers to an external file/document with extensive information about the associated metadata object. A Descriptor can have a related Descriptor set. This concept is defined as a subclass of DUL:InformationObject.
[XDF] and [ESS] BCI applications may extend this concept based on its purpose as an information object (practical usage of the Web resource). Thus, some subclasses of this concept could be: (*) Annotation. (*) Channel locations. (*) Descriptive metadata. (*) Event instance. (*) Experiment note. (*) Specification.
descriptor of an external resource
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of Descriptors to capture specific external information that complements a metadata object.
1
0
1
1
Sensors
Device_(SOSA-SSN).png
=== ** scopeNote ** This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of BCI devices to measure specific Modality(ies) for BCI activities, such as: (*) EEG (Electroencephalogram) device. (*) ECG (Electrocardiogram) device. (*) MoCap (Motion Capture) device. Example: LeapMotion (MoCap) Tracker. (*) Eye-Gaze device (for gaze or eye-tracking). Example: EyeTribe. (*) Audio device. (*) Video device. (*) Hand-Gesture device. (*) Keyboard device. (*) Mouse device. (*) Visual BCI device. This ontology does not define all the BCI devices listed above. ===
2017-08-20T22:40:00
[XDF], [SSN], [Compton2009]
Device
Status: *STABLE*
[SSN] A Device is a physical piece of technology (a system in a box) that implements a sensing method (similar to the concept of oldssn:SensingDevice) and, thus, observes some Modality (a sosa:ObservableProperty) of an Aspect (a sosa:FeatureOfInterest). In the BCI domain, a Device is a physical BCI device (or sensor) that is used to measure BCI activities. A Device, of course, collects data (represented by RecordedData) in a sosa:Observation (a Record).
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
[Compton2009] According to the Sensor Ontology, a sensor has a set of independent clusters of concepts: (*) Domain: FeatureOfInterest and PhysicalQuality. (*) Abstract properties: OperationModel that defines a ResponseModel. (*) Concrete properties: SensorGrounding. A Device in this ontology corresponds to the concepts for Abstract properties (#2).
[oldSSN] Based on the guidelines explained in the following examples: (*) (5.3.12 Device), (*) (5.4.1 University deployment example -- 5.4.1.3 Sensor), (*) (5.4.2 Smart product example -- 5.4.2.2 Sensor), (*) (5.4.3 Wind sensor (WM30) -- 5.4.3.2 Wind Sensor system), and (*) (5.4.4 Agriculture Meteorology Sensor Network -- 5.4.4.1.3 Sensor view), some of the core restrictions modeled initially from oldssn:SensingDevice are: (*) For ssn-system:SystemCapability, two distinct kinds of measurement capabilities are identified and defined: (*) Those used for defining the Channeling Spec: a set of Channels. These are associated indirectly via the DeviceChannelingSpec concept. (*) Other measurement capabilities not related to any channel definition: NonChannels. These are associated directly via a ssn:hasMeasurementCapability sub property, as follows: ssn:hasMeasurementCapability (hasNonChannelData) only ssn-system:SystemCapability (NonChannel): multiple instances. (*) ssn:observes (observes) only sosa:ObservableProperty (Modality): for BCI, it is implied that it only has one instance. (*) ssn:detects (detects) only ssn:Stimulus (StimulusEvent): multiple instances; they detail what made the sosa:Sensor input.
BCI device
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of BCI devices to measure specific Modality(ies) for BCI activities.
1
1
Descriptor,Sensors
Aspect-and-Modality_(SOSA-SSN).png
Descriptor_(SOSA-SSN).png
Device_(SOSA-SSN).png
2016-08-08T23:13:00
DeviceChannelingSpec
Status: *STABLE*
Each Device supports specific channeling schema information: all the supported logical components (Channels) and their extended metadata that describe a "more concrete" subset of its Modality's data structure model and template, based on the Device's own physical spec of its operational features and functionalities. A DeviceChannelingSpec captures two information sets for a specific Device: (*) Its complete channeling schema description, in the form of an external document specification (outside the metadata repository): a specialized Descriptor. (*) Relevant metadata attributes regarding the specific characteristics of the Device's channeling schema: a set of related Channels. The structure described in a DeviceChannelingSpec (the first information set mentioned above) is a functional subset of the ChannelingSpec defined for the Modality that the Device supports, following the [SSN] data model. Hence, for practical reasons, a DeviceChannelingSpec is defined as a subclass of ChannelingSpec.
Theoretically, a DeviceChannelingSpec could be defined as a specialized DeviceSpec concept. However, for practical reasons, this ontology aligns the definition of its first information set with ChannelingSpec.
device channeling schema spec
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of a DeviceChannelingSpec to capture the external information that defines the channeling schema information of a Device.
0
Descriptor,Sensors
Descriptor_(SOSA-SSN).png
Device_(SOSA-SSN).png
=== ** scopeNote ** (*) Manufacturer (source [XDF], [ESS]): manufacturer of the sensor (device). { History note } (*) [ESS 1.0] Corresponds to the (/study/summary/recordedModalities/modality/recordingDevice) node definition: name or type of recording device used to acquire data (manufacturer name). (*) [ESS 2.0] Corresponds to the (/study/recordingParameterSets/recordingParameterSet/channelType/modality/name) node definition: the name (brand) of the sensor device. For example: BioSemi, OptiTrack, SMI, etc. ===
2017-08-31T00:21:00
[SSN], [XDF], [ESS]
DeviceSpec
Status: *STABLE*
[SSN], [XDF] A DeviceSpec is an oldssn:SensorDataSheet (information object) that records (describes) specific properties (such as: hardware specs, power used, types of connectors, etc.) of a Device. It has been modeled as a composite object so that it can be composed as a set of DeviceSpecs to describe specific parts of a Device. In this way, a DeviceSpec is considered a bag of descriptive properties about the Device. A DeviceSpec is a specialized Descriptor. The relevant set of a Device's properties is recorded directly (with hasChannelData and hasNonChannelData), but DeviceSpecs can be used to record any other descriptive information related to the physical device, such as: (*) to record the manufacturers' specifications versus observed capabilities, or (*) if more is known than the manufacturer specifies, etc.
(*) The channeling schema that supports a Device is defined as an independent component from the DeviceSpec. A Device's channeling schema (DeviceChannelingSpec) is a subset of the generic ChannelingSpec defined for its corresponding Modality. (*) This ontology does not define any particular information object of a DeviceSpec.
Some BCI applications based on [XDF] find it important to keep information regarding the hardware specifications of their Devices. Hence, a BCI application could define a classification for different types of specifications, such as: (*) Hardware specs: (*) Manufacturer (source [XDF], [ESS]): manufacturer of the sensor (device). (*) Material (source [XDF]): conductive material of the sensor (e.g. Ag-AgCl, Foam, Plastic, Rubber). (*) Model (source [XDF]): model of the sensor. (*) Serial number (source [XDF]): serial number of the device. Its generalization was taken from the description of the "Gaze meta data". (*) Ownership specs: (*) Name / Label: a logical human-readable name or label of the device. (*) Organization: organization name that owns the device.
device specification
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe specific types of DeviceSpec.
EEG
MeasurementCapability_(SOSA-SSN).png
2018-05-20T23:25:00
[XDF], [ESS]
EegChannel
Status: *STABLE*
Defines a broader type of an EEG Channel (channeling scheme information component), used in BCI applications to collect EEG (Electroencephalography) data. This concept relates directly to the notion of an electrode capturing brainwave activity.
See general remark about: EEG-CONCEPTS
BCI applications based on [ESS] and [XDF] could define the following channeling metadata attributes for an EegModality spec: (*) Label: (it could be defined as part of the RecordChannelingSpec, if it is the same value for all the channels) (*) [ESS 2.0]: a comma-separated list of labels for the corresponding channels. This node is required for the EEG Modality. (*) [XDF]: EEG channel label, according to the labeling scheme. For EEG, the preferred labeling scheme is 10-20 (or the finer-grained 10-5). (*) Placement: (it could be defined as part of the RecordChannelingSpec, if it is the same value for all the channels) (*) [ESS 2.0]: location of the reference channel or channels used during EEG or ECG recording. Should only be provided if the ModalitySignalType (Modality) is EEG or ECG. For EEG, the preferred location convention is presented below. Choose between the following values (or provide a new value if the reference is not any of these options): {"Right Mastoid", "Left Mastoid", "Mastoids","Linked Mastoids" [for electrically linked mastoids],"Cz" [top of the head],"CMS" [e.g. in BIOSEMI],"Left Ear","Right Ear","Ears","Average","Nasion","Nose"}. For Wilson Central Terminal ECG reference use "WCT". (*) [XDF]: (*) { LocationX, LocationY, LocationZ }: 3D position (measured location) of the electrode on the head's surface based on a coordinate system (frame of reference). Each value is described as: (*) { LocationX }: coordinate axis pointing from the center of the head to the right, in millimeters. (*) { LocationY }: coordinate axis pointing from the center of the head to the front, in millimeters. (*) { LocationZ }: coordinate axis pointing from the center of the head to the top, in millimeters. XDF states that if the used coordinate system is arbitrary, the application should then include well-known fiducials (landmarks) for co-registration. (*) LocationType = { 10-10, 10-20, 10-5, Custom, EGI }: channel location type/standard used.
(*) [XDF] ChannelFormat = { double64, float32, int16, int32, int64, int8, string }: corresponds to the channel_format field of the StreamHeader chunk section. It's one of the 3 required fields in the XDF header. (*) [XDF] Signal Referencing Scheme: (*) isCommonAverage: (boolean data type); "true" if the subtracted reference signal was a common average, otherwise "false". (*) isSubtracted: (boolean data type); "true" if a reference signal has already been subtracted from the data, otherwise "false". BCI applications should include their own relevant dictionaries (placement, formats, etc.), as part of their proprietary extended semantic definitions.
EEG channel
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant EegChannels, based on specialized EegModality(ies), that BCI applications may require.
0
1
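A minimal sketch of an XDF-style record for one EEG channel, using the attributes listed above (Label, LocationX/Y/Z, LocationType, ChannelFormat, isCommonAverage, isSubtracted). The concrete values and the validator function are hypothetical; only the dictionaries come from the XDF attributes quoted in the text.

```python
# Enumerations quoted from the XDF attributes above.
XDF_CHANNEL_FORMATS = {"double64", "float32", "int16", "int32",
                       "int64", "int8", "string"}
XDF_LOCATION_TYPES = {"10-10", "10-20", "10-5", "Custom", "EGI"}

def validate_eeg_channel(channel):
    """Check a channel record against the XDF dictionaries above."""
    return (channel.get("ChannelFormat") in XDF_CHANNEL_FORMATS
            and channel.get("LocationType") in XDF_LOCATION_TYPES)

cz = {
    "Label": "Cz",            # 10-20 labeling scheme
    "LocationType": "10-20",
    "LocationX": 0.0,         # mm, center of head -> right
    "LocationY": 0.0,         # mm, center of head -> front
    "LocationZ": 100.0,       # mm, center of head -> top (hypothetical)
    "ChannelFormat": "float32",
    "isCommonAverage": False,
    "isSubtracted": True,
}
print(validate_eeg_channel(cz))  # True
```

As the note above says, real BCI applications should carry their own placement/format dictionaries as part of their extended semantic definitions.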
EEG
Device_(SOSA-SSN).png
2016-08-05T03:51:00
EegDevice
Status: *STABLE*
Defines a broader type of an EEG Device, used in BCI applications to collect EEG (Electroencephalography) data.
See general remark about: EEG-CONCEPTS
EEG device
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant EegDevices, based on specialized EegModality(ies), that BCI applications may require.
EEG
Aspect-and-Modality_(SOSA-SSN).png
2018-06-06T23:30:00
EegModality
Status: *STABLE*
A specific type of Modality for EEG (Electroencephalography). This modality can be further classified depending on different measurement procedures, applications and set of stimuli.
See general remark about: EEG-CONCEPTS
EEG modality
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant EEG modalities. In the following, we present a possible classification for this type: (*) ERP (Event Related Potential -voltage-): related to a stimulus. (*) Visually Evoked Potential (VEP): Video. (*) Glaucoma (mfVEP - Vision Field Sensitivity). (*) TVEP: Transient. (*) AEP: Aural. (*) "Free Run".
EEG
MeasurementCapability_(SOSA-SSN).png
2016-08-10T06:03:00
EegNonChannel
Status: *STABLE*
The NonChannel of a specific EegDevice.
See general remark about: EEG-CONCEPTS
non-channeling EEG data component (other EEG measurement capability)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant NonChannels specific for an EegDevice.
1
EEG
Record_(SOSA-SSN).png
2016-08-05T04:09:00
EegRecord
Status: *STABLE*
Defines a broader type of an EEG Record, which represents the class of observations for EEG (Electroencephalography) data.
See general remark about: EEG-CONCEPTS
EEG record
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant EegRecords, based on specialized EegModality(ies), that BCI applications may require.
Observations
Aspect-and-Modality_(SOSA-SSN).png
2016-05-24T00:56:00
EmotionalAspect
Status: *STABLE*
Describes the classification of EmotionalAspects.
emotional aspect
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the EmotionalAspects of the Records.
1
AnnotationTag
FeatureParameter.png
2018-04-16T03:27:00
FeatureParameter
Status: *STABLE*
It is an auxiliary data analytic related to an "epoch" (pointed to by a Marker) in the raw data (DataBlock) that describes an event of interest in the analysis (its semantics are captured by a ResponseTag). In practice, a set of FeatureParameters characterizes the content of a ResponseTag. A FeatureParameter is commonly formally defined by an underlying mathematical/statistical Model, which describes how it is calculated. FeatureParameters are: (*) Relevant computed data (used by or built) in developed algorithms/Models that analyze the raw dataset (DataBlock). (*) Pinpointed for each "epoch" of the raw dataset recording (DataBlock). An epoch is characterized by a ResponseTag marker; the temporal tagging of epochs (frequency-based, duration/interval, timestamped) depends on both the nature of the data and the analytical Models used over them. Thus, most of the features are considered to be transient information objects, i.e., they have an expiration date.
Some feature parameters for mfSSVEP (Steady-State Visually Evoked Potential with Vision Field Sensitivity) data sets for glaucoma patients are: (*) CCA correlations between the mfSSVEP signals and the set of reference signals (sinusoids or binary sequences) derived from each visual stimulus (one per visual stimulus). (*) CCA coefficients of the reference signals that yield maximum correlation with the mfSSVEP signals (one per reference signal and visual stimulus). (*) Power Spectral Density (PSD) of the optimal combination of mfSSVEP signals maximally correlated with the set of reference signals derived from each visual stimulus (one per visual stimulus). Note: the above parameters may be the differences between the PSDs when the visual stimulus was present and absent from the stimuli patterns.
feature
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any FeatureParameter entities.
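The epoch/feature pipeline described above (slice the raw DataBlock data around a Marker timestamp, then compute a FeatureParameter over the resulting epoch) can be sketched as follows. The mean amplitude stands in for a real feature such as a CCA correlation or a PSD value, and all names and numbers are illustrative.

```python
def extract_epoch(raw, sampling_rate, marker_ts, pre, post):
    """Slice the raw samples around a Marker timestamp (seconds),
    taking `pre` seconds before it and `post` seconds after it."""
    start = max(int((marker_ts - pre) * sampling_rate), 0)
    end = int((marker_ts + post) * sampling_rate)
    return raw[start:end]

def mean_amplitude(epoch):
    """Placeholder FeatureParameter; a real Model would compute e.g.
    CCA correlations or PSD values over the epoch instead."""
    return sum(epoch) / len(epoch)

raw = [float(i % 4) for i in range(1024)]  # fake signal sampled at 128 Hz
epoch = extract_epoch(raw, 128.0, marker_ts=2.0, pre=0.5, post=0.5)
print(len(epoch), mean_amplitude(epoch))   # 128 1.5
```

The epoch window (pre/post) corresponds to the duration/interval style of temporal tagging mentioned above; a frequency-based or purely timestamped tagging would slice differently.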
Actuation
Actuation.png
2018-05-09T23:53:00
[SSN], [Seydoux2016]
ImpactedProperty
Status: *STABLE*
[SSN] It represents an actuatable quality (property or characteristic) of an ActuationTarget. An Actuator connects to an ImpactedProperty via the object property ssn:forProperty. [Seydoux2016] Following the Actuation-Actuator-Effect (AAE) design pattern proposed for the IoT Ontology (IoT-O), this concept captures the definition of ImpactedProperty (linked to san:Effect) from the following relationships: [san:Effect] --(impacts)--> [ImpactedProperty]; [ImpactedProperty] --(is property of)--> [sosa:FeatureOfInterest]. An Actuator triggers an ActuationEvent that causes an effect (modification) on the ActuationTarget: the ImpactedProperty.
See general remark about: 2_MAPPINGS-TO-SAN
For a complete interpretation about the san:Effect definition and its related semantic notions for this ontology, please refer to the editorial note in ActuationEvent.
%GENERAL_EXAMPLE%@Actuation-Use-Case
[SSN] A window actuator (Actuator) acts by changing (it triggers an ActuationEvent that changes) the state between a frame and a window (ActuationTarget). The ability of the window to be opened and closed is its ImpactedProperty.
impacted property (as a consequence of an actuation effect)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant ImpactedProperty classification.
0
2
2
0
1
Session,Subject
Subject.png
2018-04-17T21:05:00
Interaction
Status: *STABLE*
A situation where multiple (more than one) Subjects interact with each other, while each is performing a single Activity. Commonly, it is expected that all the Subjects in an Interaction engage in the same Activity, but this is not required. Because an Interaction groups a set of Sessions (potentially many for each Subject), BCI applications can make correlations among these Sessions.
(*) This concept defines a Cluster of Sessions: a cross-sectional collection of multiple related Sessions that occur at the same time. (*) In an Interaction where multiple Subjects are performing different Activity-ies under the same Context, the Actions done by the Subjects are modeled as different Context.Events in (possibly) multiple Context.Scenes.
Interaction of multiple subjects
1
1
1
AnnotationTag
Marker_(SOSA-SSN).png
2018-04-17T23:10:00
Marker
Status: *STABLE*
Corresponds to the "entry points" of the annotation tags of the data.
annotation tag (or data segment pointer)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the type of Markers (or Annotation Tags) that define "entry points" in DataSegments. This ontology defines two types of Markers: the ResponseTag and the StimulusTag.
0
1
Observations
Aspect-and-Modality_(SOSA-SSN).png
=== ** scopeNote ** This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of human signals (Modality Signal Type --nature of the data--) analyzed by BCI applications, such as: (*) EEG (Electroencephalogram). (*) ECG (Electrocardiogram). (*) MoCap (Motion Capture). (*) Eye-Gaze (for gaze or eye-tracking). (*) Audio. (*) Video. (*) Hand-Gesture. (*) Keyboard-Hit. (*) Mouse-Click. This ontology does not define all the modalities listed above. ===
2016-08-20T05:19:00
[SSN], [Compton2009]
Modality
Status: *STABLE*
[SSN]: A Modality is a kind of an "Observable Quality", i.e., an aspect (the human signals) of an entity (the human body) that is intrinsic to and cannot exist without the entity and is observable by a sensor (Device). In the BCI domain, the Modality defines a certain type of measurement (classification) related to a specific kind of data due to its nature. Literally, Modality means the "Mode of the data". The Modality defines, in an intrinsic manner, the operational functionality of any Device based on its related ChannelingSpec information. That is, a specific type of Device operates for a specific type of Modality: the nature of the data sensed. Each Modality must have its own complete and generic ChannelingSpec information.
See general remark about: ASPECT-and-MODALITY
A Modality has its own specific: (*) Measurement procedures, (*) Applications (each one with relevant attributes), and (*) Stimuli.
The following descriptions capture the definition of this concept ([oldSSN: Property] and [Compton2009]) adjusted to this ontology: (*) An observable Quality of human physiological signals. That is, a characteristic of an Aspect (human body's state) that is intrinsic to and cannot exist without the Aspect and is observable by a Device. (*) Devices observe physiological signals (Modality-ies) of Aspects: for example, the EEG signals (Modality) of an emotion (Aspect).
[ESS 1.0]: (*) This data object describes the names of the different types of modalities recorded in a study. Corresponds to the (/study/summary/recordedModalities/modality/name) node definition. [ESS 2.0]: (*) It contains information about one or more sets of recording data parameters (which can apply to multiple Records). (*) Corresponds to the (/study/recordingParameterSets/recordingParameterSet) node definition. (*) Most studies have only a single parameter set, i.e., the same types of data (EEG, Mocap, etc.) are recorded in the same channel ranges, with the same device types and with the same sampling rates. (*) This "recording parameter set" is associated with Records nodes (which represent the "dataRecording" nodes).
recorded modality
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of human signals (Modality Signal Type --nature of the data--) analyzed by BCI applications.
0
1
1
1
1
1
AnnotationTag
Model_(SOSA-SSN).png
2018-04-16T00:38:00
Model
Status: *STABLE*
Describes a Machine Learning Model (commonly, a mathematical optimization or computational statistics algorithm for predictive analytics) that "detects something" in a DataSegment. A common name for a Model is "classifier".
In the BCI domain, a Model can generate many different results related to a ResponseTag: each one, can be depicted as a FeatureParameter.
Right after a set of DataSegments is created, two consecutive tasks occur: (*) A set of editing processes is run on the recordings, which classify the data sets based on different Models. (*) A set of ResponseTags is created, which capture the information related to "what is so special about" a particular DataSegment.
model
Observations
Aspect-and-Modality_(SOSA-SSN).png
2016-05-24T00:47:00
NeurologicalAspect
Status: *STABLE*
Describes the classification of NeurologicalAspects. One application for this Aspect is glaucoma monitoring.
neurological aspect
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the NeurologicalAspects of the Records.
SystemCapabilities
MeasurementCapability_(SOSA-SSN).png
=== ** editorialNote ** The BCI ontology defines this modeling structure for EEG Devices: see the definition of the Object Property hasEegNonChannelData. ===
2017-08-30T22:34:00
[SSN]
NonChannel
Status: *STABLE*
[SSN] The NonChannel of any Device describes a set of measurement properties (ssn-system:SystemProperty-ies) of a sensor (sosa:Sensor) in specific conditions, as explained in the [SSN] System Capabilities Module and the [oldSSN] MeasuringCapability Module, that are not related directly to any DeviceChannelingSpec. Note that the measurement properties described in this concept are of a sensor (Device, a subclass of sosa:Sensor), not of a specific observed measurement (Record, a subclass of sosa:Observation).
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
See general remark about: UNITS-OF-MEASUREMENT
This ontology leaves it open to BCI applications how to properly describe basic non-channeling measurement capabilities for their relevant set of different classes of sensors (Device class hierarchy) used in BCI activities. A modeling guideline (based on [oldSSN]: oldssn:MeasurementCapability) can be found at: (*) (5.3.5 MeasuringCapability -- 5.3.5.2 How to describe capabilities of a sensor?), (*) (5.4.2 Smart product example -- 5.4.2.3 Measurement capabilities). The sensor ontology does not restrict the way in which specific measurement properties (oldssn:MeasurementCapability) are described. Thus, specialized applications may define their own values of measurement properties (oldssn:MeasurementCapability). (oldssn:MeasurementCapability maps to ssn-system:SystemCapability) If necessary, BCI applications should (but are not required to) define a set of restrictions and specialized connections (subproperties) on the property hasNonChannelData (subproperty of ssn-system:hasSystemCapability) for each particular subclass of Device (subclass of sosa:Sensor), which describes sensors of specific types. A relevant non-channeling measurement property (ssn-system:SystemProperty) related to a Device is the sampling rate. Based on the modeling of the ssn-system:SystemCapability and ssn-system:SystemProperty concepts (along with the guidelines found in 5.4.2 Smart product example -- 5.4.2.2 Sensor), this ontology defines the SamplingRate concept.
non-channeling data component (other BCI measurement capability)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe a set of relevant non-channeling measurement capabilities for each type of Device.
0
1
1
1
1
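A hypothetical triple-level sketch of the pattern described above: a Device linked to a NonChannel capability via hasNonChannelData, with a SamplingRate attached through ssn-system:hasSystemProperty. The instance IRIs (ex:...) are invented for illustration, and the triples are shown as plain tuples rather than through an RDF library.

```python
# (subject, predicate, object) triples sketching the NonChannel pattern.
triples = [
    ("ex:myEegDevice", "rdf:type", "bci:Device"),
    ("ex:myEegDevice", "bci:hasNonChannelData", "ex:capability1"),
    ("ex:capability1", "rdf:type", "bci:NonChannel"),
    ("ex:capability1", "ssn-system:hasSystemProperty", "ex:rate1"),
    ("ex:rate1", "rdf:type", "bci:SamplingRate"),
]

def objects_of(triples, subj, pred):
    """All objects of triples matching (subj, pred, ?)."""
    return [o for s, p, o in triples if s == subj and p == pred]

print(objects_of(triples, "ex:myEegDevice", "bci:hasNonChannelData"))
```

Restricting hasNonChannelData per Device subclass, as the guideline suggests, would simply constrain which capability individuals may appear as objects of that predicate.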
Session,Context
Playout.png
2018-04-06T02:32:00
Playout
Status: *STABLE*
Describes the data logging (recording) of the dynamic state of the Context: the "play out" of the happenings (Context.Events). A Playout consists of many PlayoutInstants.
playout record
1
Session,Context
PlayoutInstant.png
2018-04-06T03:03:00
PlayoutInstant
Status: *STABLE*
Captures any relevant log entry in a Playout (related to Context.Events). Two specific types of instances are defined: (*) PlayoutInstant.SubjectAction. (*) PlayoutInstant.ContextEvent.
playout instant
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the types of log entries in a Playout. Two important types of log entries are defined in this ontology.
Describing entities that form any log entry in a Playout.
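The Playout/PlayoutInstant structure described above can be sketched as an ordered event log (hypothetical Python names mirroring the ontology classes and the two defined subtypes):

```python
# Sketch: a Playout is an ordered log of PlayoutInstants; the two
# subtypes below mirror PlayoutInstant.SubjectAction and
# PlayoutInstant.ContextEvent. All names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class PlayoutInstant:
    timestamp: datetime
    description: str


class ContextEvent(PlayoutInstant):
    """Log entry for an event issued by a Context.Event instance."""


class SubjectAction(PlayoutInstant):
    """Log entry for an event issued by a Subject's Action instance."""


@dataclass
class Playout:
    instants: list = field(default_factory=list)

    def log(self, instant: PlayoutInstant) -> None:
        self.instants.append(instant)


p = Playout()
p.log(ContextEvent(datetime(2018, 4, 6, 2, 32), "red light on"))
p.log(SubjectAction(datetime(2018, 4, 6, 2, 33), "subject pressed key"))
print(len(p.instants))  # 2
```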
Session,Context
PlayoutInstant.png
2018-02-11T03:03:00
PlayoutInstant.ContextEvent
Status: *STABLE*
Captures a relevant log entry in a Playout of a Context Event issued by a Context.Event instance during a Session.
playout instant: context event type
Events (Context.Event) issued in a Context during a Session.
Session,Context
PlayoutInstant.png
2018-02-11T02:57:00
PlayoutInstant.SubjectAction
Status: *STABLE*
Captures a relevant log entry in a Playout of a Subject's Event issued by an Action instance during a Session.
playout instant: subject action type
Events (Actions) issued by a Subject during a Session.
Observations
RecordedData_(SOSA-SSN).png
Google Protocol Buffers DataFormat
2016-06-03T02:25:00
ProtocolBuffersDataFormat
Status: *STABLE*
Represents a Protocol Buffers DataFormat, which is a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols and data storage.
protocol buffers BCI data format
0
1
1
1
1
1
1
1
1
1
1
1
1
1
1
Observations
Record_(SOSA-SSN).png
=== ** scopeNote ** This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of BCI records generated by BCI devices for BCI activities, such as: (*) EEG (Electroencephalogram) record. (*) ECG (Electrocardiogram) record. (*) MoCap (Motion Capture) record. (*) Eye-Gaze record (for gaze or eye-tracking): A BCI Record that stores the coordinates of user's eye gaze captured by eye trackers. (*) Audio record. (*) Video record. (*) Hand-Gesture record: A BCI Record that stores the coordinates and velocities of user's hands and fingers captured by trackers such as LeapMotion. (*) Keyboard-Hit (keystroke) record: A BCI Record that stores the subject's hits on different keyboard keys. (*) Mouse-Click record: A BCI Record that stores the position coordinates of a subject's clicks with different mouse buttons. (*) Visual BCI record. This ontology does not define all the BCI records listed above. ===
2018-01-29T02:36:00
[SSN], [ESS]
Record
Status: *STABLE*
A Record is a type of sosa:Observation with the following characteristics: (*) A single sosa:Observation for a specific unimodal BCI data capture task (with its own purpose). (*) [SSN] A Sensing Method (procedure) is used to estimate or calculate a value of a specific sosa:ObservableProperty (Modality) based on a specific sosa:FeatureOfInterest (Aspect). (*) A single Device is used to observe the unimodal BCI data (RecordedData). [oldSSN]: Record, along with its related concepts, defines an appropriate structure based on the following description: "An observation (Record) is a situation that describes an observed feature (Aspect), an observed property (Modality), a sensor (Device) and method of sensing used and a value (RecordedData) observed for the property: that is, an observation (Record) describes a single value (RecordedData) attributed to a single property (Modality) by a particular sensor (Device)". In the BCI domain, it's common that some related observations occur immediately after an observation has ended, by changing some of its initial channeling or Device settings (parameters or conditions). Hence, it is desirable to keep temporal tracking of the previous and following related observations. This is achieved via the hasPrevious and hasNext object properties. The logical data structure template of a Record is defined in its associated RecordChannelingSpec information object. General and consolidating data analytics for the whole raw data recordings can be associated with a Record through a set of related FeatureParameters. Additional relevant metadata can be extended via the object property hasMeasurementProperty.
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
See general remark about: PROCEDURES
Following the structure presented in this example, the Record concept has been modeled to describe BCI observations, including the following object properties (SOSA object property / BCI object subproperty: from ... to ...): (*) sosa:hasFeatureOfInterest / aspectOfInterest: from sosa:Observation (Record) to sosa:FeatureOfInterest (Aspect). (*) sosa:observedProperty / observedModality: from sosa:Observation (Record) to sosa:ObservableProperty (Modality). (*) sosa:madeBySensor / observedByDevice: from sosa:Observation (Record) to sosa:Sensor (Device). (*) ssn:wasOriginatedBy / (no BCI subproperty): from sosa:Observation (Record) to ssn:Stimulus (StimulusEvent). (*) sosa:hasResult / observationResult: from sosa:Observation (Record) to oldssn:SensorOutput (RecordedData). (*) (notion taken from oldssn:isProducedBy) / isProducedByDevice: from oldssn:SensorOutput (RecordedData) to sosa:Sensor (Device). Adequate restrictions have been designed accordingly for each object property.
[oldSSN] states that "an Observation is a description of the context, the Situation, in which the observation was made". In this ontology, the Context is directly related through the Session, which is a Situation where the Observation (Record) was made.
[oldSSN] Following the guidelines explained in (5.3.6 Observation) and (5.4.2 Smart product example -- 5.4.2.4 Observation), the main restrictions modeled for sosa:Observation (Record) are: (*) Exactly 1 sosa:FeatureOfInterest (Aspect): details what was sensed. (*) Exactly 1 sosa:ObservableProperty (Modality): details what was sensed. (*) Exactly 1 sosa:Sensor (Device): describes what made the Observation. (*) Some ssn:Stimulus (StimulusEvent): details (what made) the sosa:Sensor input. Other restrictions are: (*) Exactly 1 oldssn:Sensing (a subclass of sosa:Procedure that describes how the Observation was made): not adjusted for BCI activities.
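The cardinality restrictions listed above can be sketched as a simple validity check (the ontology states these as OWL restrictions on Record; the Python names below are hypothetical stand-ins for the linked entities):

```python
# Sketch of the Record restrictions: exactly one Aspect, one Modality
# and one Device, plus at least one StimulusEvent. All names are
# illustrative, not an API defined by BCI-O.
from dataclasses import dataclass, field


@dataclass
class Record:
    aspect_of_interest: str           # exactly 1 sosa:FeatureOfInterest
    observed_modality: str            # exactly 1 sosa:ObservableProperty
    observed_by_device: str           # exactly 1 sosa:Sensor
    stimulus_events: list = field(default_factory=list)  # some ssn:Stimulus

    def is_valid(self) -> bool:
        return (bool(self.aspect_of_interest)
                and bool(self.observed_modality)
                and bool(self.observed_by_device)
                and len(self.stimulus_events) >= 1)


r = Record("vigilance", "EEG", "EegDevice-01",
           stimulus_events=["red light for 15 seconds"])
print(r.is_valid())  # True
```

A Record with no associated StimulusEvent would fail the check, matching the "some ssn:Stimulus" restriction.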
BCI record (measurement record)
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of BCI records generated by BCI devices for BCI activities.
1
1
1
1
Descriptor,Observations
Aspect-and-Modality_(SOSA-SSN).png
Descriptor_(SOSA-SSN).png
Record_(SOSA-SSN).png
2016-08-08T23:15:00
RecordChannelingSpec
Status: *STABLE*
Based on the adjusted settings of the DeviceChannelingSpec made for the observation, a Record has its own specific channeling schema information: that is, the logical components (Channels) and their extended generic metadata set that describe the Record's own logical data structure (specific to the observation and the Subject), according to the recording setup. A RecordChannelingSpec captures two information sets for a specific Record: (*) Its complete channeling schema description, in the form of an external document specification (outside the metadata repository). (*) Relevant metadata attributes regarding the general characteristics of the channeling schema: a set of related Channels. The structure described in a RecordChannelingSpec (first information set mentioned above) is based on a concrete subset of the DeviceChannelingSpec that the Device supports. Hence, for practical reasons, a RecordChannelingSpec is defined as a subclass of DeviceChannelingSpec.
The channeling schema information objects are structured in the following way. The Channeling Schema of a... (*) Modality (ChannelingSpec): it's the complete theoretical spec; the generic template for a specific sosa:ObservableProperty. (*) Device (DeviceChannelingSpec): it's a functional subset of the ChannelingSpec; defines the logical subset of the complete spec for the specific functionality of a oldssn:SensingDevice. (*) Record (RecordChannelingSpec): it's the concrete subset of the DeviceChannelingSpec for a specific sosa:Observation. This information object is user specific according to the recording setup.
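The three-level subset hierarchy just described can be sketched as a chain of classes whose channel sets shrink at each level (hypothetical Python names; channel labels are illustrative 10-20 positions):

```python
# Sketch of the channeling-spec hierarchy:
#   ChannelingSpec        (full theoretical spec for a Modality)
#     DeviceChannelingSpec  (functional subset supported by one Device)
#       RecordChannelingSpec  (concrete subset used in one Record)
class ChannelingSpec:
    def __init__(self, channels):
        self.channels = set(channels)


class DeviceChannelingSpec(ChannelingSpec):
    """Should hold a subset of the Modality's full ChannelingSpec."""


class RecordChannelingSpec(DeviceChannelingSpec):
    """Should hold a subset of the Device's DeviceChannelingSpec."""


full = ChannelingSpec({"Fp1", "Fp2", "Cz", "O1", "O2", "A1", "A2"})
device = DeviceChannelingSpec({"Fp1", "Fp2", "Cz"})
record = RecordChannelingSpec({"Fp1", "Cz"})

assert device.channels <= full.channels    # functional subset
assert record.channels <= device.channels  # concrete, per-observation subset
print(sorted(record.channels))  # ['Cz', 'Fp1']
```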
(*) A RecordChannelingSpec for an EegRecord would define the values of the positions for specific EegChannels used by the EegDevice, when the observation is made. A proper name for this spec would be EegRecordChannelingSpec. (*) For "Precision" Records, BCI applications may find it important to keep the information regarding the Channel's coordinates.
Related to the RecordChannelingSpec, this concept has the following definitions: (*) [ESS 2.0]: a comma separated list of labels of the corresponding referenced channel or channels. This node is required for EEG Modality and it's used during EEG or ECG recording. For example, if using 10-20 system and numerical average of both mastoids, use "A1, A2" for {referenceLabel} and "Mastoids" for {referenceLocation}. Note that there could be multiple labels. (*) [XDF] Signal referencing scheme: name of the dedicated reference channel(s), if part of the measured channels (repeated if multiple). For an EEG channel label, its value is based on the labeling scheme. For EEG, the preferred labeling scheme is 10-20 (or the finer-grained 10-5).
record channeling schema spec
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant type of a RecordChannelingSpec to capture the external information that defines the channeling schema information of a Record.
0
Descriptor,Observations
Descriptor_(SOSA-SSN).png
Record_(SOSA-SSN).png
2016-07-19T03:23:00
[XDF], [ESS]
RecordSpec
Status: *STABLE*
A RecordSpec is an information object that records (describes) specific properties (such as: specs of assistant materials, ambience settings, tools, etc.) regarding how a Record was made. Similarly to DeviceSpec, the structure of RecordSpec has been modeled as a composite object so that it can be composed of a set of RecordSpecs describing specific parts of how a Record was made. In this way, a RecordSpec is considered a bag of general, extended and descriptive properties about the Record settings. A RecordSpec is a specialized Descriptor. RecordSpecs can be used to record any other descriptive and extended information related to any settings or conditions on how the Record was made.
(*) The channeling schema of a Record is defined as an independent component from the RecordSpec. A Record's channeling schema (RecordChannelingSchema) is a subset of the DeviceChannelingSchema defined for its correspondent Device. (*) This ontology does not define any information object in particular of a RecordSpec.
Some BCI applications based on [XDF] find it important to keep information regarding which assistant materials (hardware) were used, and how, when the Record was made. Hence, a BCI application could define a classification for different types of specifications, such as: (*) Hardware specs for EegRecords: (*) Coupling (source [XDF]): type of coupling used (e.g. Capacitive, Dry, Gel, Saline). (*) Surface (source [XDF]): type of the contact surface (e.g. Bristle, Pad, Pins, Plate).
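The composite structure of RecordSpec can be sketched as follows (hypothetical Python names; the property values mirror the [XDF]-style examples above):

```python
# Sketch of RecordSpec as a composite: a spec may be composed of child
# specs, each describing one part of how a Record was made. All names
# are illustrative.
from dataclasses import dataclass, field


@dataclass
class RecordSpec:
    name: str
    properties: dict = field(default_factory=dict)
    parts: list = field(default_factory=list)

    def flatten(self) -> dict:
        """Collect the properties of this spec and all of its parts."""
        merged = dict(self.properties)
        for part in self.parts:
            merged.update(part.flatten())
        return merged


hardware = RecordSpec("hardware", {"coupling": "Gel", "surface": "Pad"})
ambience = RecordSpec("ambience", {"lighting": "dim"})
spec = RecordSpec("eeg-record-spec", parts=[hardware, ambience])
print(spec.flatten()["coupling"])  # Gel
```

The composite design lets applications describe each aspect of the recording setup independently while still treating the whole as one bag of properties.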
record specification
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe specific types of RecordSpec.
1
1
0
1
1
1
Observations,Results
Record_(SOSA-SSN).png
RecordedData_(SOSA-SSN).png
2018-02-10T03:24:00
[oldSSN], [SSN]
RecordedData
Status: *STABLE*
[oldSSN] It is a specific type of oldssn:SensorOutput (aligned to sosa:Result), which is a piece of information outputted by a Device in a sosa:Observation: an observed value for BCI activities. Based on [oldSSN], the value itself is represented by a specific type of oldssn:ObservationValue: a sequence of DataBlocks. This concept abstracts a raw data set (independent of its representation and access method) outputted by a Device for a specific Modality. In this way, a RecordedData has: (*) A single data representation, a DataFormat, and (*) multiple AccessMethods (either archived or in real-time). In the BCI domain, it's common that the data "evolves" over time. That is, there are changes in the data structure: (*) from its "initial" state (ever since it's collected from a Device: raw data) (*) to "following" states (when applying specialized algorithms to recognize patterns through Models: transformed data). For example, for EEG data this evolution over time resembles a tree structure. Hence, it is desirable for BCI applications to keep temporal tracking of the previous and following versions of the data: a derived data tree built by keeping links between data versions. This is achieved via the hasPrevious and hasNext object properties.
See general remark about: 1_MAPPINGS-TO-SOSA-SSN
From the perspective of the data (RecordedData): (*) Its physical structure is defined through the DataFormat. (*) Its logical structure is defined through the associated RecordChannelingSpec of its Record (the defined collection of Channels).
In [ESS 1.0]: (*) Corresponds to the "eegRecordings" node: a specific collection of raw BCI data collected from a subject in a specific session. In [ESS 2.0]: (*) Corresponds to the "dataRecordings" node: information about EEG (or other data modality) recordings.
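The derived-data tree described above, where versions of RecordedData are linked via hasPrevious/hasNext, can be sketched as follows (hypothetical Python names and data labels):

```python
# Sketch: RecordedData versions form a tree from raw data down to
# transformed data sets, linked in the spirit of the hasPrevious and
# hasNext object properties. All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class RecordedData:
    label: str
    previous: "RecordedData | None" = None     # analogue of hasPrevious
    next: list = field(default_factory=list)   # analogue of hasNext

    def derive(self, label: str) -> "RecordedData":
        """Create a derived (transformed) version linked to this one."""
        child = RecordedData(label, previous=self)
        self.next.append(child)
        return child


raw = RecordedData("raw-eeg")
filtered = raw.derive("bandpass-filtered")
ica = raw.derive("ica-cleaned")
print(filtered.previous.label, len(raw.next))  # raw-eeg 2
```

Because a raw recording can have several derived versions, the links naturally form a tree rather than a simple chain, matching the EEG example in the definition.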
recorded BCI data
1
1
1
AnnotationTag
Marker_(SOSA-SSN).png
2018-01-02T03:01:00
ResponseTag
Status: *STABLE*
Information object that captures a Marker issued by a Model. In the BCI domain, a natural (physiological or neurological) change in the Subject's state while doing an Activity is simply called a State. A ResponseTag may not be directly linked to a change in the Context (induced by a Context.Event, specifically a StimulusEvent) in a Session. A ResponseTag represents "something" detected by a Machine Learning Model (Model). Its content is represented by a set of related FeatureParameters.
Right after a set of DataSegments is created, two consecutive tasks occur: (*) A set of editing processes is run on the recordings, which classify the data sets based on different Models. (*) A set of ResponseTags is created, which captures the information related to "what is so special about" a particular DataSegment.
This is one of the most important concepts in this ontology.
In a common M2M semantic search query, the following input parameters may be used to retrieve a set of ResponseTags related to its DataSegments: (*) Activity: for example, driving. (*) Aspect: for example, vigilance and alert. (*) Modality: for example, EEG, EOG. (*) Subject: filtered by gender, age, etc.
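The M2M query pattern above can be sketched as a simple filter over tags carrying the relevant metadata (hypothetical Python names; in practice this would be a SPARQL query over the linked Session/Record metadata):

```python
# Sketch: retrieve ResponseTags by Activity, Aspect, Modality and
# Subject attributes. All names and sample values are illustrative.
from dataclasses import dataclass


@dataclass
class ResponseTag:
    activity: str        # e.g. driving
    aspect: str          # e.g. vigilance, alert
    modality: str        # e.g. EEG, EOG
    subject_gender: str  # Subject filter attribute


tags = [
    ResponseTag("driving", "vigilance", "EEG", "Female"),
    ResponseTag("driving", "alert", "EOG", "Male"),
    ResponseTag("reading", "vigilance", "EEG", "Female"),
]

# Query: driving sessions, EEG or EOG modalities.
hits = [t for t in tags
        if t.activity == "driving" and t.modality in {"EEG", "EOG"}]
print(len(hits))  # 2
```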
state (response tag)
SystemCapabilities
SamplingRate_(SOSA-SSN).png
2017-08-30T23:52:00
[ESS], [XDF]
SamplingRate
Status: *STABLE*
Sampling rate of the Device. Its measurement unit is Hertz (Hz). As a relevant non-channeling measurement property (ssn-system:SystemProperty) related to a Device, its modeling is based on: (*) the ssn-system:SystemCapability and ssn-system:SystemProperty concepts, and (*) the guidelines found in (5.4.2 Smart product example -- 5.4.2.2 Sensor).
See general remark about: UNITS-OF-MEASUREMENT
Related concept for a Record: hasSamplingRate.
sampling rate of a device
0
0
1
0
0
1
1
1
1
1
1
1
1
Session
Session.png
2018-04-17T22:03:00
[ESS], [SSN]
Session
Status: *STABLE*
A Session monitors how one Subject interacts with one Context while performing one Activity, by collecting a nonempty set of multimodal biomedical sosa:Observations (Records) and/or sosa:Actuations (Actuations). A Session has the following characteristics and restrictions: (*) Comprises a collection of multimodal BCI data capture tasks (each one with its own specific measurement purpose: Aspect). (*) Monitors exactly one Subject. (*) Monitors exactly one Activity (performed by the Subject). (*) Monitors exactly one Context (while the Subject interacts with it). (*) Comprises exactly one Playout collected from the associated Context. (*) Groups multiple different Records (multimodal data) that are observed (collected) simultaneously from the Subject. (*) May group multiple different Actuations made simultaneously for the Subject. [oldSSN]: The concept of Session defines an appropriate structure to group multiple Records (multimodal data), based on the following description: "Observations (Records) of multiple features (Aspects) or multiple properties (Modality-ies) of the one feature should be represented as either compound properties, features and values or as multiple observations, grouped in some appropriate structure". (A similar depiction applies to Actuation.) In the BCI research domain, a Session can have multipurpose extended metadata sets to describe broader concepts and definitions regarding the nature and purpose of this information object. These external metadata sets can be associated with Descriptors.
See general remark about: PROCEDURES
External descriptions that complement and extend the information about a Session, can be added through Descriptors. BCI applications based on [ESS 1.0] could define Descriptors to include information, such as: (*) Lab. ID. -- identifier of the session used in the original lab notes (if available, otherwise insert 'NA'). (*) Task Label. -- indicates which task is being performed in the session (e.g. A, B, C,...). Only use this node if there are different tasks. Otherwise leave the node blank. (*) For Session entities: If different tasks occur in the same session repeat the session node with a different "taskLabel", and other information that may be different, such as the "eegRecording" node. (*) For EventCode entities: Use this only if there are multiple tasks in the study and they use the same event codes, otherwise leave blank.
In [ESS 2.0], a Session is related to the RecordedParameterSet concept, found in the following XML element: (../recordingParameterSet/recordingParameterSetLabel). The "recordingParameterSet" node groups the information of multiple Modality-ies and, also, DeviceSpecs. Hence, it implies that a Session (a Record set) is associated with (uses) multiple Devices, due to the "dataRecording" node definition in ESS (multiple RecordedData).
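The Session restrictions above can be sketched as a container with exactly-one links and a nonempty observation set (hypothetical Python names; the ontology states these as OWL restrictions):

```python
# Sketch of Session: exactly one Subject, Activity, Context and
# Playout; a nonempty set of Records and/or Actuations. All names
# are illustrative, not an API defined by BCI-O.
from dataclasses import dataclass, field


@dataclass
class Session:
    subject: str    # exactly 1 Subject
    activity: str   # exactly 1 Activity
    context: str    # exactly 1 Context
    playout: str    # exactly 1 Playout
    records: list = field(default_factory=list)     # multimodal Records
    actuations: list = field(default_factory=list)  # optional Actuations

    def is_valid(self) -> bool:
        # A Session must collect at least one Record or Actuation.
        return len(self.records) >= 1 or len(self.actuations) >= 1


s = Session("subj-01", "driving", "simulator", "playout-01",
            records=["eeg-record", "eog-record"])
print(s.is_valid(), len(s.records))  # True 2
```

Grouping the simultaneous multimodal Records under one Session object reflects the [oldSSN] guidance quoted above about grouping multiple observations "in some appropriate structure".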
BCI session
1
Context,Observations
StimulusEvent_(SOSA-SSN).png
=== ** HED class in EXAMPLE SPEC ** HED.Stimulus: Defines a general bStimuli HED. Scope note template: This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any XXX. Subclasses: (*) HED.Auditory (*) HED.Pain (*) HED.Smell (*) HED.TMS (*) HED.Tactile (*) HED.Taste (*) HED.Visual. Diagram: HED.Stimulus_Visual.png. Source: [HED]. ** HED attribute for the HED class in EXAMPLE SPEC ** Event code tag, based on the Hierarchical Event Descriptor (HED) Tags for Analysis of Event-Related EEG Studies document (if available, otherwise leave blank). ===
2018-04-11T13:08:00
[SSN]
StimulusEvent
Status: *STABLE*
A StimulusEvent describes an event that triggers a stimulus to the Subject during a Session. By its own nature, it may affect the Subject's performance of the Activity (and, therefore, a set of Actions related to the Activity). A StimulusEvent is an external happening on a specific Context that generates the input for the sensors ([SSN] concepts). Contextually, these events are raised (raises) by a set of interacting Context.Objects through a Context.Method. In [SSN], this concept is a subclass of ssn:Stimulus and, therefore, a oldssn:SensorInput, which describes the (data) input for the sosa:Sensors. In the BCI domain, this concept is simply called an "Event": the stimuli or trigger that causes the relevant measurement to be, in fact, processed or analyzed.
The following descriptions capture the definition of this concept ([oldSSN: Stimulus], 5.3.1.2.1 Stimuli) adjusted to this ontology: (*) StimulusEvents are detectable changes in the environment (Context), i.e., in the physical or a virtual world. (*) A StimulusEvent is an event that "triggers" the Device. (*) StimulusEvents can be either directly or indirectly related to observable Modality-ies and, therefore, to Aspects. (*) The same types of StimulusEvents can trigger different kinds of Devices and be used to reason about different Modality-ies. (*) The Modality-ies associated with the StimulusEvent may be different from the eventually observed (Record) Modality. (*) It is the StimulusEvent, not the Context.Object, that triggers the Device. (*) A StimulusEvent may only be usable as a proxy for a specific region of an observed Modality.
A StimulusEvent describes a specific component of the Context that "generates" an annotation tag (StimulusTag). Some examples of a StimulusEvent are: (*) Red light for 15 seconds at a 66 Hz. frequency. (*) Green light for 10 seconds at a 74 Hz. frequency.
stimulus event
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe any relevant contextual event that describes the stimuli or trigger that causes a relevant BCI measurement.
1
AnnotationTag
Marker_(SOSA-SSN).png
2016-05-20T18:48:00
StimulusTag
Status: *STABLE*
Information object that captures a Marker induced by a StimulusEvent. While doing the data recording, the system automatically creates a Marker (StimulusTag) for the oldssn:SensorInput based on the Context (issued by a Context.Event, specifically a StimulusEvent) of the Session.
This is one of the most important concepts in this ontology.
stimulus tag
0
1
1
0
1
Subject
Subject.png
Context.png
2018-06-05T23:49:00
[ESS], [XDF]
Subject
Status: *STABLE*
A specific physical (natural) person (probably anonymous but possessing a unique identity) with certain attributes, on whom the Sessions are recorded (from whom the data is observed: Record). A Subject interacts with a Context through her Actions. "Subject" comes from the common terminology used in BCI experiments when referring to a specific person. This concept is based on the notion of an Electronic Patient/Medical Record, such as the HL7 Record. Information objects related to this concept (namely SubjectState) capture the description of the Medical and Physiological "Condition" of a Subject in a Session. Thus, a Subject may have multiple Descriptors associated with it, such as HL7 Records or specific XML vocabularies from the industry. This ontology does not define any specific set of attributes associated with a Subject. BCI applications can extend this concept according to their information needs and system requirements.
The subject is the point of reference (focus) of the data monitoring and data analysis, from which BCI applications collect Measurement Recordings. Hence, the name Subject instead of Person.
[ESS] and [XDF] define some useful data type properties (attributes) associated with a Subject. Some examples of these attributes are: (*) Gender: (*) Defined as an enumerated value = { Female, Male, ... }. (*) It can be derived as a subproperty extended from the (dbp:Person).sex property definition. (*) Year of birth (YOB): (*) Defined as a positive integer greater than or equal to 1900. (*) It can be derived as a subproperty extended from the dbp:Person definition. (*) Handedness: (*) Defined as an enumerated value = { Ambidextrous, Left, N/A, Right }. (*) Subject's dominantly used hand. Related to medical record. (*) Hearing: (*) Defined as an enumerated value = { CorrectedToNormal, Impaired, Normal }. (*) Subject's hearing condition. Related to medical record. (*) Vision: (*) Defined as an enumerated value = { CorrectedToNormal, Impaired, Normal }. (*) Subject's vision condition. Related to medical record.
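The enumerated attribute suggestions above can be sketched with simple enums (hypothetical Python names; the ontology itself leaves these attributes open to applications):

```python
# Sketch of [ESS]/[XDF]-style Subject attributes. All names are
# illustrative; values mirror the enumerations suggested above.
from dataclasses import dataclass
from enum import Enum


class Handedness(Enum):
    AMBIDEXTROUS = "Ambidextrous"
    LEFT = "Left"
    NA = "N/A"
    RIGHT = "Right"


class Condition(Enum):
    """Shared by the Hearing and Vision attributes."""
    CORRECTED_TO_NORMAL = "CorrectedToNormal"
    IMPAIRED = "Impaired"
    NORMAL = "Normal"


@dataclass
class Subject:
    gender: str
    year_of_birth: int  # positive integer >= 1900
    handedness: Handedness
    hearing: Condition
    vision: Condition


s = Subject("Female", 1985, Handedness.RIGHT,
            Condition.NORMAL, Condition.CORRECTED_TO_NORMAL)
print(s.handedness.value)  # Right
```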
person
Descriptor,Session,Subject
Descriptor_(SOSA-SSN).png
Session.png
2016-06-29T05:00:00
[ESS], [HED]
SubjectState
Status: *STABLE*
Describes the state of the Subject during the Session, through a collection of external specifications which capture extended metadata of the Subject's overall state. A state can be further classified to document more accurately the nature of the metadata (such as physiological state, cognitive state or emotional state). The nature of this concept is "transient" and depends directly on the Session: it is considered an extended collection of metadata related to the Session that captures the overall state of the Subject during the data recording. A SubjectState is, itself, a specialized Descriptor that may have multiple Descriptors associated with it, which describe extended metadata sets such as the HL7 Record.
Some examples of SubjectState may include descriptions regarding: (*) Physiological state: (*) [ESS] Age: Subject's age (in years) at the time of the Session. (*) [ESS] Height: Subject's height in centimeters (at the moment of the Session). (*) [ESS] Weight: Subject's weight in kilograms (at the moment of the Session). (*) [ESS] Hearing: Subject's hearing (e.g. "CorrectedToNormal", "Impaired", "Normal"). (*) [ESS] Vision: Subject's vision (e.g. "CorrectedToNormal", "Impaired", "Normal"). (*) [ESS] Caffeine: number of hours since last caffeine intake, if less than 12 hours. (*) [ESS] Alcohol: whether the Subject has consumed alcohol within 24 hours before the Session (a logical value). (*) [ESS] Medication: specification of the medication intake based on different parameters (time, chemical compounds, etc.). (*) Drowsiness: identified in [HED 1.31] as "awake". (*) Stress level. (*) [HED 1.31] Emotional state: (*) Alertness. Some additional metadata related to this concept used for research purposes could be: (*) A set of attributes to label the identity of the Subject in the Session. Example: (*) [ESS] A Lab. ID as a de-personalized Subject identifier in the research lab. (*) [ESS] A sequential ID to identify the Subject in a collection of Sessions. (Case: "InSessionNumber" attribute in [ESS 2.0]). (*) [ESS] An attribute to identify the group type that the Subject belongs to based on the research nature of the Sessions. Example: a Session Group to identify the Subject's group (e.g. "Autistic", "Normal", "Control", etc.).
Subject's state during a specific session
%APPLICATION%@cerebratek_nupod
This class is intended to be the root of a class hierarchy. Domain-specific applications can extend this class hierarchy for their own purposes to describe the different types of SubjectStates that can be found in a Session.
Observations
RecordedData_(SOSA-SSN).png
The XDF DataFormat.
2016-06-03T02:16:00
[XDF]
XdfDataFormat
Status: *STABLE*
Represents an XDF DataFormat. XDF is a general-purpose container format for multi-channel time series data with extensive associated meta-information. XDF is tailored towards biosignal data such as EEG, EMG, EOG, ECG, GSR, MEG, etc.
XDF EEG data format
(general)
2017-12-11T01:49:00
https://w3id.org/BCI-ontology#
Status: *STABLE*
Instance that identifies the BCI Ontology (BCI-O) as a vocabulary used in the linked data cloud. Its identifier corresponds to the BCI-O namespace URI.
bci: