Errors Likely to Occur in the Creation and Management of Oceanographic Datasets, and Their Causes -II. The Case of the Iwate Fisheries Technology Center and the Handling of Duplicated Data-
Abstract
After oceanographic data have been archived by a data management agency such as JODC (Japan Oceanographic Data Center), it is hard to send questionable data back to their originator for correction, even when such data are found. They are not eliminated from the dataset; instead, an error flag is attached to them. It is nevertheless desirable to minimize the number of questionable data. We investigated error sources that often arise during data processing, collection, and storage, in order to find ways to improve the quality of data flowing into the JODC/MIRC system. In the previous paper (Nagata et al., 1999), we analyzed the dataset obtained by the Wakayama Research Center of Agriculture, Forestry and Fisheries (WRCAFF) and found that errors were generated mainly in the punching process. In this paper, we report the results of an analysis of the database of the Iwate Fisheries Technology Center (IFTC). Data quality at IFTC improved considerably after 1970, just as at WRCAFF. Many duplicated data were found in the IFTC database. The main cause of the duplication is that IFTC maintains two kinds of datasets (Coastal Lines and Offshore Lines), and data obtained at some stations were sometimes entered into both. Checking for duplicated data is an important part of data management. We discuss techniques for duplication checking by referring to the case of IFTC.
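The duplication problem described above, where the same station cast appears in both the Coastal Lines and Offshore Lines datasets, can be illustrated with a minimal sketch. This is not the paper's actual procedure; the record keys (`date`, `lat`, `lon`) and the positional tolerance are assumptions chosen for illustration only.

```python
# Hypothetical sketch of a duplication check: two records are flagged as
# likely duplicates when they share an observation date and their positions
# agree within a small tolerance. Field names and tolerance are assumptions.

def find_duplicates(records, pos_tol=0.01):
    """Return index pairs (i, j) of records with the same date and
    near-identical position (within pos_tol degrees)."""
    # Group record indices by date so only same-date pairs are compared.
    by_date = {}
    for i, r in enumerate(records):
        by_date.setdefault(r['date'], []).append(i)

    dups = []
    for idxs in by_date.values():
        for a in range(len(idxs)):
            for b in range(a + 1, len(idxs)):
                i, j = idxs[a], idxs[b]
                if (abs(records[i]['lat'] - records[j]['lat']) <= pos_tol and
                        abs(records[i]['lon'] - records[j]['lon']) <= pos_tol):
                    dups.append((i, j))
    return dups

# A station sent to both datasets shows up as a duplicate after merging.
coastal = [{'date': '1970-05-01', 'lat': 39.50, 'lon': 142.00}]
offshore = [{'date': '1970-05-01', 'lat': 39.50, 'lon': 142.00},
            {'date': '1970-05-02', 'lat': 39.80, 'lon': 142.30}]
print(find_duplicates(coastal + offshore))  # → [(0, 1)]
```

In practice, a check of this kind must also decide which copy of a flagged pair to keep, and whether near-matches with slightly differing positions or times are true duplicates or distinct casts.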
- Paper of the 海洋調査技術学会 (Japan Society for Marine Surveys and Technology)