The data subset consists of a network of individual land stations designated by the World Meteorological Organization for use in climate monitoring. The data show monthly average temperature values for over 1,500 land stations. But without the full list of stations, we can't use their algorithm to verify that the station set is correct or complete.
So it's not the raw data, which means we can't verify that the adjustments were made correctly.
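To sketch what verification would look like if the raw station data and the adjustment code had been released (the station values and the adjustment function below are entirely hypothetical stand-ins, not anything from the actual release):

```python
# Hypothetical sketch: checking published adjusted values against raw data.
# This only works if BOTH the raw series and the adjustment code are available,
# which is exactly what's missing from the release.

def adjust(raw_series, offset):
    """Stand-in for whatever homogenization was actually applied."""
    return [round(t + offset, 6) for t in raw_series]

# Hypothetical raw monthly means (deg C) for one station.
raw = [3.1, 4.0, 7.2, 10.5]

# Values as they might appear in the published subset.
published = [3.6, 4.5, 7.7, 11.0]

# With raw data and code in hand, the check is trivial:
recomputed = adjust(raw, offset=0.5)
mismatches = [(r, p) for r, p in zip(recomputed, published)
              if abs(r - p) > 1e-6]
print("reproducible:", not mismatches)
```

Without the raw series, there is nothing to feed into `adjust`, and without the code, there is no `adjust` to run; the published numbers simply have to be taken on faith.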
The unzipped database is 33 MB in size. You could comfortably fit that on a mag tape.
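For scale, a standard 2,400-foot 9-track reel written at 6,250 bpi holds on the order of 170-180 MB. A back-of-the-envelope check (ignoring inter-record gaps, which reduce real-world capacity somewhat):

```python
# Rough capacity of a 2400 ft, 9-track tape at 6250 bpi.
# On 9-track tape each frame across the width holds one byte,
# so 6250 bpi is effectively 6250 bytes per inch of tape.
density_bytes_per_inch = 6250
length_inches = 2400 * 12          # 2400 ft reel
capacity_mb = density_bytes_per_inch * length_inches / 1e6
print(f"{capacity_mb:.0f} MB")     # ~180 MB raw capacity
```

So a single reel could hold several copies of the entire unzipped database, with room to spare.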
Let's be clear about this from a data-processing perspective: you don't delete raw data. You keep it, because then you can always reconstruct things. Mag tapes weren't so expensive that someone couldn't afford to keep all the data on one.
Can someone explain this? We've got altered temperatures, yet no original data and no program code for the alterations. How did this pass peer review?