Please use this identifier to cite or link to this item:
http://hdl.handle.net/20.500.11861/6527
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Prof. LI Yi Man, Rita | en_US |
dc.contributor.author | Leung, Tat Ho | en_US |
dc.date.accessioned | 2021-03-07T09:11:59Z | - |
dc.date.available | 2021-03-07T09:11:59Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | In Yang, X.S., Sherratt, S., Dey, N. & Joshi, A. (eds.) (2019). Fourth international congress on information and communication technology (pp. 17-22). | en_US |
dc.identifier.isbn | 9789813293427 | - |
dc.identifier.isbn | 9789813293434 | - |
dc.identifier.uri | http://hdl.handle.net/20.500.11861/6527 | - |
dc.description.abstract | Construction sites are among the most hazardous workplaces. While most previous research has focused on the human aspect, we propose to use the Fast R-CNN object-detection method to detect hazards on construction sites and to employ mixed reality to enable artificial-intelligence-based hazard detection. Fast region-based convolutional neural network (Fast R-CNN) object detection acquires expert knowledge to identify objects in an image. Unlike image classification, object detection is inherently more complex and demands solutions that balance speed, accuracy and simplicity. (A minimal detection sketch follows this record.) | en_US
dc.language.iso | en | en_US |
dc.title | Computer vision and hybrid reality for construction safety risks: A pilot study | en_US |
dc.type | Conference Paper | en_US |
dc.relation.conference | 4th International Congress on Information and Communication Technology | en_US |
dc.identifier.doi | 10.1007/978-981-32-9343-4_2 | - |
item.fulltext | No Fulltext | - |
crisitem.author.dept | Department of Economics and Finance | - |
Appears in Collections: | Economics and Finance - Publication |
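
The region-based CNN pipeline the abstract describes can be illustrated with a short, hedged sketch. This is not the authors' implementation: it uses torchvision's closely related Faster R-CNN (Fast R-CNN itself requires external region proposals and has no off-the-shelf torchvision model), COCO-pretrained weights rather than a construction-hazard dataset, a hypothetical image file `site_photo.jpg`, and an illustrative 0.5 score threshold.

```python
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

# COCO-pretrained Faster R-CNN; "DEFAULT" selects the best available weights.
# The paper's model would instead be trained on construction-site imagery.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# "site_photo.jpg" is a hypothetical construction-site image, not from the paper.
image = convert_image_dtype(read_image("site_photo.jpg"), torch.float)

with torch.no_grad():
    (detections,) = model([image])  # the model returns one dict per input image

# Keep confident detections; 0.5 is an illustrative threshold, not the authors'.
keep = detections["scores"] > 0.5
for box, label, score in zip(detections["boxes"][keep],
                             detections["labels"][keep],
                             detections["scores"][keep]):
    # Boxes are [x1, y1, x2, y2] in pixels; labels index the COCO categories.
    print(f"class {label.item()} at {box.tolist()} (score {score:.2f})")
```

In a mixed-reality deployment of the kind the abstract sketches, boxes like these would plausibly be rendered as headset overlays to flag hazards to workers on site.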
SCOPUS™ citations: 7 (checked on Nov 17, 2024)
Page view(s): 123 total; 1 last week; 1 last month (checked on Nov 21, 2024)