As I understand it, to you a systematic error is a mathematical ghost which, in your mind, means no trend can ever be calculated.
Explain how the actual temperature can fall outside a day's high/low range. That's only possible if the measurements are wrong. What if the observer gets drunk enough one day to make a 10 °C mistake? Is he going to make it every day for the rest of his life? Why wouldn't averaging mask the mistake enough to still see a future trend?
Your systematic error only exists in an imaginary world. Prove it isn't changed when the system is changed!
I never said the actual temperature fell outside the High/Low range. My point is that how the average temperature is computed affects what gets called "the average."
In the Las Vegas example, the High was 104 °F and the Low was 80 °F, so the High/Low average was 92 °F, but the hourly average was 93.52 °F. That is a delta of +0.84 °C arising purely from how the data was averaged.
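A minimal sketch of the two averaging methods applied to the same day. The 24 hourly readings below are made up (chosen to mimic a desert day that spends more hours near its high than its low), so the delta here will not match the Las Vegas figure exactly; the point is only that the two methods disagree even on identical data.

```python
# Hypothetical hourly readings (°F) for one desert day.
hourly_f = [80, 80, 81, 82, 84, 87, 90, 94, 97, 100, 102, 103,
            104, 104, 104, 103, 102, 100, 98, 95, 91, 87, 84, 82]

hi, lo = max(hourly_f), min(hourly_f)
hilo_avg = (hi + lo) / 2                     # traditional High/Low mean
hourly_avg = sum(hourly_f) / len(hourly_f)   # modern hourly mean

delta_f = hourly_avg - hilo_avg
delta_c = delta_f * 5 / 9                    # a temperature *difference*: °F × 5/9 → °C
print(f"Hi/Low avg: {hilo_avg:.2f} F, hourly avg: {hourly_avg:.2f} F, "
      f"delta: {delta_c:+.2f} C")
```

Because the asymmetric daily cycle spends more time near the high, the hourly mean comes out above the Hi/Low midpoint, which is the sign of bias the example describes.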
Since GISS does not record which averaging method was used, and both methods are still in use, the error range could be close to the total observed warming.
Consider this scenario: in 1900 we had 1,000 stations doing Hi/Low averaging. After 1980, the weather service began upgrading stations to record hourly measurements, and its budget allowed 50 stations a year to be upgraded. Each year, 50 additional stations would report the systematically higher average temperature. After 20 years all 1,000 stations would be reporting the hourly average, and the entire data set would show an artificial warming of ~0.7 °C. It doesn't matter that the hourly average is more accurate; what matters is that it is being compared to the earlier Hi/Low average.
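The rollout above can be sketched directly. The +0.7 °C per-station bias and the 50-stations-per-year schedule are the post's assumptions, not measured values; the simulation just shows how a fixed methodological bias, phased in gradually, looks like a 20-year warming trend even when every station's true temperature is constant.

```python
# Hypothetical network: 1000 stations, 50 upgraded per year from 1981 on.
# Each upgraded station is assumed to read +0.7 C higher than its old
# Hi/Low average would have (the post's assumed bias).
BIAS_C = 0.7
N_STATIONS = 1000
UPGRADES_PER_YEAR = 50

def network_anomaly(year):
    """Mean artificial warming (°C) across the network in a given year,
    assuming the true temperature at every station is unchanged."""
    upgraded = min(max(year - 1980, 0) * UPGRADES_PER_YEAR, N_STATIONS)
    return upgraded / N_STATIONS * BIAS_C

for year in (1980, 1985, 1990, 1995, 2000):
    print(year, f"{network_anomaly(year):+.3f} C")
# 1980 → +0.000, 1990 → +0.350, 2000 → +0.700
```

The anomaly climbs linearly until the last station is converted in 2000, then plateaus at the full 0.7 °C bias, which is exactly the "artificial warming" pattern the scenario describes.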