@prefix this: <http://purl.org/np/RAddoAfDWi1XRY6UGp_7Kh8ft_2omnURtnGI3qZ3miiLI> .
@prefix sub: <http://purl.org/np/RAddoAfDWi1XRY6UGp_7Kh8ft_2omnURtnGI3qZ3miiLI#> .
@prefix np: <http://www.nanopub.org/nschema#> .
@prefix c4o: <http://purl.org/spar/c4o/> .
@prefix doco: <http://purl.org/spar/doco/> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix dct: <http://purl.org/dc/terms/> .
@prefix pav: <http://purl.org/pav/> .

sub:Head {
  this: np:hasAssertion sub:assertion ;
    np:hasProvenance sub:provenance ;
    np:hasPublicationInfo sub:pubinfo ;
    a np:Nanopublication .
}

sub:assertion {
  sub:paragraph c4o:hasContent "The proposed system is context-aware and combines the information extracted by the SCN with logic rules and knowledge of what the camera observes, building information and events that may have occurred. Our system differs from other computer vision systems mainly in three respects. First, no images are sent: the smart cameras extract the knowledge from the images, and this knowledge is then sent to the central unit (for more details, see Section 6.1). Second, the WiseNET system combines context information with the camera information to improve the detections; in this way, it can overcome missed detections or non-detectable information (e.g., people out of sight of a camera). Third, the system uses an ontology to fuse the different kinds of information present in a Panoptes building, such as information about the environment, time, smart cameras, events, detectable objects, etc." ;
    a doco:Paragraph .
}

sub:provenance {
  sub:assertion prov:hadPrimarySource <http://dx.doi.org/10.3233/SW-180298> ;
    prov:wasAttributedTo <https://orcid.org/0000-0002-8429-8208> .
}

sub:pubinfo {
  this: dct:created "2019-11-08T18:05:11+01:00" ;
    pav:createdBy <https://orcid.org/0000-0002-7114-6459> .
}
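
The four named graphs above (Head, assertion, provenance, pubinfo) follow the standard nanopublication schema. As a minimal sketch of how this structure can be consumed, the following Python snippet uses rdflib to load a local TriG copy of the nanopublication and pull the assertion paragraph out of its named graph. The file name "nanopub.trig" and the use of rdflib are assumptions for illustration, not part of the source.

from rdflib import ConjunctiveGraph, Namespace

NP = Namespace("http://www.nanopub.org/nschema#")
C4O = Namespace("http://purl.org/spar/c4o/")

g = ConjunctiveGraph()
g.parse("nanopub.trig", format="trig")  # hypothetical local copy of the TriG above

# The Head graph links the nanopublication to its assertion graph.
assertion_uri = next(g.objects(predicate=NP.hasAssertion))

# Look only inside the assertion graph for the paragraph text.
assertion_graph = g.get_context(assertion_uri)
for paragraph, text in assertion_graph.subject_objects(C4O.hasContent):
    print(paragraph)
    print(text)

Running this against the nanopublication shown here would print the sub:paragraph URI followed by the quoted c4o:hasContent text.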