Problem with tabular data
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Problem with tabular data
I thought I knew DA inside out, but I've just started trying to plot tabular data and discovered that perhaps I don't. Can anyone tell me what I'm doing wrong?
I have modified the digatmos.fmt file to stipulate the columns in which the data is found, placing the new template after the default ones provided, thus:
.
.
WXX=19
VIS=14
[metarTepper]
ICA=2
LAT=3
LON=4
WND=5
WNS=6
TMP=8
DWP=9
TIM=12
DAT=11
I then import a csv file, the first few lines of which are:
[metarTepper]
WMO_NO,ABBR,LAT,LON,WDIR,WSPD,GUST,TEMP_DB,DEW_PT,SITE_NO,DATEK,TIMEK
95214,WYND,-15.51,128.15,120,25.9,35.2,34.5,-0.2,1006,20080505,1500
94102,TROU,-13.75,126.15,70,33.4,38.9,31.3,16.9,1007,20080505,1500
94100,KALU,-14.3,126.65,100,16.7,27.8,33.3,1.2,1019,20080505,1500
95101,TRUS,-14.09,126.39,70,14.8,22.2,32.8,6.9,1020,20080505,1500
94212,HALL,-18.23,127.66,110,24.1,33.4,32,-7.2,2012,20080505,1500
94216,KUNU,-15.78,128.71,110,27.8,38.9,33.2,-4.3,2056,20080505,1500
94217,ARGY,-16.64,128.45,120,24.1,38.9,33.7,-4.9,2064,20080505,1500
.
.
The file shows up when I look at it using File > View raw data. But when I try to plot it on a map of Australia (or the world) I get nothing. The status line indicates Observations: 0 total, 0 in domain, 0 plotted.
Interestingly, if I import an Australian Bureau of Meteorology axf data file (for which the sample templates already in digatmos.fmt were made) and then plot it, it does plot, though the status line indicates Observations: 0 total, 0 in domain, 450 plotted. I cannot pick any difference, except the data, between my tabular file/digatmos.fmt template and the Bureau's, yet one works and the other doesn't.
Any ideas would be appreciated.
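For what it's worth, the column mapping above can be sanity-checked outside DA. This is a hypothetical sketch, not DA's actual parser, and it assumes the template's column numbers are 1-based (which matches the header positions quoted above); it simply prints what each template field would pick out of the sample rows:

```python
import csv
import io

# The [metarTepper] mapping from digatmos.fmt, as quoted in the post.
# Column numbers are assumed to be 1-based.
TEMPLATE = {"ICA": 2, "LAT": 3, "LON": 4, "WND": 5, "WNS": 6,
            "TMP": 8, "DWP": 9, "TIM": 12, "DAT": 11}

# Two sample data rows from the csv file.
SAMPLE = """\
95214,WYND,-15.51,128.15,120,25.9,35.2,34.5,-0.2,1006,20080505,1500
94102,TROU,-13.75,126.15,70,33.4,38.9,31.3,16.9,1007,20080505,1500
"""

def extract(row, template):
    """Return the named fields from a CSV row using 1-based column numbers."""
    return {name: row[col - 1].strip() for name, col in template.items()}

for row in csv.reader(io.StringIO(SAMPLE)):
    print(extract(row, TEMPLATE))
```

If the fields printed here line up with the header (ICA is the four-letter abbreviation, LAT/LON are the coordinates, and so on), the column numbering itself is probably not the culprit.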
-
- Senior Member
- Posts: 101
- Joined: Sun Nov 30, 2003 5:05 am
- Location: Fort Worth, TX
Re: Problem with tabular data
While I have never used this format, your question intrigued me, so I did a little experimenting. The first few lines from "METAR from Australia BoM (AXF format via FTP)":
[metar2Data]
ID_num, ID_name[6], Date, Time, Lat, Lon, Wdir, Wspd, T_DB, DP, QNH, RF9am, RF10m, Vis, AVis, Gust, Wx1Int, Wx1Dsc, Wx1Wx1, Wx1Wx2, Wx1Wx3, Wx2Int, Wx2Dsc, Wx2Wx1, Wx2Wx2, Wx2Wx3, Cld1Amt, Cld1Typ, Cld1Base, Cld2Amt, Cld2Typ, Cld2Base, Cld3Amt, Cld3Typ, Cld3Base, Cld4Amt, Cld4Typ, Cld4Base, Ceil1Amt, Ceil1Base, Ceil2Amt, Ceil2Base, Ceil3Amt, Ceil3Base
94300, "YCAR ", 20121115, 1320, -24.89, 113.67, 190, 18.0, 21.3, 17.2, 1014.6, 0.0, 0.0, -9999, 10000, 22.0, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, 11, 390, -9999, -9999, -9999, -9999
94406, "YBLB ", 20121115, 1330, -26.82, 114.61, -9999, -9999.0, -9999.0, -9999.0, 1015.1, -9999.0, -9999.0, -9999, -9999, -9999.0, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999
94647, "ECLT ", 20121115, 1318, -31.68, 128.88, 140, 11.0, 16.7, 12.8, 1018.7, 0.0, 0.0, -9999, 10000, 17.0, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, 8, 480, -9999, -9999, -9999, -9999
and the first few lines from your converted file:
[metarTepper]
WMO_NO,ABBR,LAT,LON,WDIR,WSPD,GUST,TEMP_DB,DEW_PT,SITE_NO,DATEK,TIMEK
95214,WYND,-15.51,128.15,120,25.9,35.2,34.5,-0.2,1006,20080505,1500
94102,TROU,-13.75,126.15,70,33.4,38.9,31.3,16.9,1007,20080505,1500
94100,KALU,-14.3,126.65,100,16.7,27.8,33.3,1.2,1019,20080505,1500
95101,TRUS,-14.09,126.39,70,14.8,22.2,32.8,6.9,1020,20080505,1500
94212,HALL,-18.23,127.66,110,24.1,33.4,32,-7.2,2012,20080505,1500
94216,KUNU,-15.78,128.71,110,27.8,38.9,33.2,-4.3,2056,20080505,1500
94217,ARGY,-16.64,128.45,120,24.1,38.9,33.7,-4.9,2064,20080505,1500
To me, the problem is evident: the file formats are not the same. A CSV (Comma-Separated Values) file separates the data with a comma, no matter how much or how little data there is. I cannot find the file format specification for .AXF, but I did find one reference saying it is a SQL database format. I'm thinking you will need to track down that format specification, and then you can use your file.
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Re: Problem with tabular data
Thanks for the quick reply, Greg. The axf format seems to be peculiar to the Australian Bureau of Meteorology, but it may indeed be some form of SQL output.
I had been following the rules set down by Tim in the DA Manual appendix: basically, all fields are to be separated by a comma. The axf file follows this rule, with the addition of a space after each comma. I'll do some experimenting and post back.
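If the only difference really is the space after each comma, a tolerant parser would treat the two styles identically. This is just a hypothetical illustration (not DA's actual parser) showing that bare-comma and comma-space lines can split to the same fields:

```python
def split_fields(line):
    """Split on commas, then trim surrounding whitespace and quotes,
    so bare-comma CSV and the Bureau's comma-space style parse alike."""
    return [field.strip().strip('"').strip() for field in line.split(",")]

csv_line = '95214,WYND,-15.51,128.15'
axf_line = '95214, "WYND ", -15.51, 128.15'
print(split_fields(csv_line))  # ['95214', 'WYND', '-15.51', '128.15']
print(split_fields(axf_line))  # ['95214', 'WYND', '-15.51', '128.15']
```

Of course, whether DA's importer is this tolerant is exactly the open question.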
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Re: Problem with tabular data
No, that didn't fix it. Even quoting the four-letter station ID as the Bureau does doesn't change anything. Both files are plain comma-delimited flat files, which is what the manual says it wants. I'm beginning to think the problem may lie with the digatmos.fmt file.
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Re: Problem with tabular data
Can anyone throw light on this subject? Surely someone is using the formatted data input method of DA, with or without success.
-
- Senior Member
- Posts: 101
- Joined: Sun Nov 30, 2003 5:05 am
- Location: Fort Worth, TX
Re: Problem with tabular data
Do your headers have the space after the comma too? Look at the two examples. Grasping at straws here...
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Re: Problem with tabular data
No.
Here are the first few lines of an Australian Bureau of Meteorology axf file that does work:
[metar2Data]
ID_num, ID_name[6], Date, Time, Lat, Lon, Wdir, Wspd, T_DB, DP, QNH, RF9am, RF10m, Vis, AVis, Gust, Wx1Int, Wx1Dsc, Wx1Wx1, Wx1Wx2, Wx1Wx3, Wx2Int, Wx2Dsc, Wx2Wx1, Wx2Wx2, Wx2Wx3, Cld1Amt, Cld1Typ, Cld1Base, Cld2Amt, Cld2Typ, Cld2Base, Cld3Amt, Cld3Typ, Cld3Base, Cld4Amt, Cld4Typ, Cld4Base, Ceil1Amt, Ceil1Base, Ceil2Amt, Ceil2Base, Ceil3Amt, Ceil3Base
94406, "YBLB ", 20121120, 1000, -26.82, 114.61, -9999, -9999.0, -9999.0, -9999.0, 1008.6, -9999.0, -9999.0, -9999, -9999, -9999.0, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999
95645, "YCAG ", 20121120, 1000, -32.27, 125.49, -9999, -9999.0, -9999.0, -9999.0, 1018.7, -9999.0, -9999.0, -9999, -9999, -9999.0, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999, -9999
and the first few lines of my (amended) csv file that doesn't:
[metarTepper]
WMO_NO, ABBR, LAT, LON, WDIR, WSPD, GUST, TEMP_DB, DEW_PT, SITE_NO, DATEK, TIMEK
95214, "WYND", -15.51, 128.15, 120, 25.9, 35.2, 34.5, -0.2, 1006, 20080505, 1500
94102, "TROU", -13.75, 126.15, 70, 33.4, 38.9, 31.3, 16.9, 1007, 20080505, 1500
From what I can see, they are identical in all respects, right down to a trailing space after the last column of data.
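One class of difference that a visual comparison (and even File > View raw data) can miss is invisible bytes: a UTF-8 byte-order mark at the start of the file, Windows CR+LF versus Unix LF line endings, or stray tabs. This is a hypothetical check, not anything DA provides, that reports those properties from a file's raw bytes:

```python
def inspect(data: bytes):
    """Report invisible properties of raw file content: UTF-8 BOM,
    Windows-style CR+LF line endings, and embedded tab characters."""
    return {
        "utf8_bom": data.startswith(b"\xef\xbb\xbf"),
        "crlf_endings": b"\r\n" in data,
        "tabs": b"\t" in data,
    }

# Example: a file saved on Windows with a BOM vs. a plain Unix file.
windows_style = b"\xef\xbb\xbf95214,WYND,-15.51\r\n"
unix_style = b'94300, "YCAR  ", 20121115\n'
print(inspect(windows_style))  # {'utf8_bom': True, 'crlf_endings': True, 'tabs': False}
print(inspect(unix_style))     # {'utf8_bom': False, 'crlf_endings': False, 'tabs': False}
```

To check the real files, pass `inspect(open(path, "rb").read())` for both the Bureau's axf file and the converted csv; if the two reports differ, that may be the difference the eye can't see.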
-
- Member
- Posts: 32
- Joined: Sat Nov 29, 2003 5:34 pm
- Location: Sydney Australia (down under)
Re: Problem with tabular data
Have you figured this problem out yet, Laurier?
If so, what was the "answer"?
Just curious.
-
- Member
- Posts: 30
- Joined: Sun Nov 23, 2003 11:28 pm
- Location: Millthorpe, NSW, Australia
- Contact:
Re: Problem with tabular data
No, Peter, unfortunately I gave up after several more attempts. It remains a mystery that perhaps will be resolved in the next upgrade of DA.