





Temporal resolution of gas abundance and isotopic fractionation is indeed limited by diffusion through the unconsolidated snowpack, but other signals, like precipitation (from annual layer depth) and summer temperature (from melt layers), are not. Tree rings and varves also provide climate information at the annual scale.

The video also fails to defend the hockey stick. It's of course silly to use the GISP2 data and point out that there is no 20th-century uptick, but it's frankly just as silly to splice an instrumental record onto proxy records, because the temporal resolutions of proxies and instrumental records are very different. For instance, ice core gas records have a sealing time of nearly a century.
I don't think he is so much trying to nail what it is as showing what it isn't. His argument is that the prediction of 1 foot based on linear extrapolation is too low, because the rate of rise will accelerate, and he presented research justifying the expectation of acceleration. IPCC AR5 WGI says that sea level rise by 2100 under RCP 8.5 is 0.74 +/- 0.23 meters, or between 1.7 and 3.2 feet. So in fact a claim of 1 foot by simple extrapolation is too low by 1.9 sigma, while 3 feet is within the 1 sigma uncertainty range.
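The sigma figures above are easy to verify. A quick back-of-the-envelope check (with the simplifying assumption that the ±0.23 m spread is treated as one standard deviation):

```python
# Back-of-the-envelope check of the AR5 WGI RCP8.5 sea level figures.
# Assumption (for illustration): the +/- 0.23 m spread is one standard deviation.

M_PER_FOOT = 0.3048

mean_m = 0.74    # AR5 WGI central estimate for 2100 under RCP8.5
sigma_m = 0.23

low_ft = (mean_m - sigma_m) / M_PER_FOOT   # lower end of the 1-sigma range
high_ft = (mean_m + sigma_m) / M_PER_FOOT  # upper end

# How many sigma below the mean is a 1-foot projection?
z_one_foot = (mean_m - 1 * M_PER_FOOT) / sigma_m
# And how far above the mean is 3 feet?
z_three_feet = (3 * M_PER_FOOT - mean_m) / sigma_m

print(f"1-sigma range: {low_ft:.1f} to {high_ft:.1f} feet")  # ~1.7 to 3.2 feet
print(f"1 foot is {z_one_foot:.1f} sigma below the mean")    # ~1.9 sigma
print(f"3 feet is {z_three_feet:.1f} sigma from the mean")   # within 1 sigma
```

This reproduces the numbers in the comment: 1 foot sits about 1.9 sigma below the central estimate, while 3 feet is within the 1-sigma range.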

I didn't say that the data were wrong, but you must be very careful about what the data are used to prove. You can't simply splice together very different ways of measuring things and then highlight an abrupt change that starts exactly at the transition.
I don't think the emission scenario was specified. 1 foot is within RCP2.6. In any case, giving a figure in the upper range of the worst-case scenario without stressing that this is what you are doing isn't a good rebuttal in a video targeting half-truths.

Proxy data extend into the instrumental record, and they show an abrupt transition. It is not simply an artifact of slapping different kinds of measurements together, nor is it an artifact of calibrating proxies to instrumental temperatures.
Not explicitly, but I think it is pretty clear that both are in the context of business as usual.

I searched a bit and found one paper by Ljungqvist using proxies extending into the instrumental record, going 2000 years back. It covers only the extratropical NH, and not all the proxies used cover the entire period, so it's not quite apples to apples, but the study seems pretty comprehensive. Yes, the proxies show the modern warming, but they fail to produce the hockey stick that you get by splicing proxies and instrumental observations. Let's look at Ljungqvist's data (temperatures relative to the 1961-90 average):

It doesn't mean that the instrumental records exaggerate the warming. I think the proxies are just sluggish. But it also means that any previous abrupt changes over the last 2000 years would likewise appear dampened in the proxy data. We don't really know if there were such abrupt changes.
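The "sluggish proxies" point can be illustrated with a toy model: if a proxy effectively records a running average of the previous few decades, a recent rapid warming shows up with reduced amplitude. A minimal sketch (the 40-year window and the synthetic warming ramp are assumptions for illustration, not Ljungqvist's actual method):

```python
import numpy as np

# Toy illustration of proxy sluggishness: a proxy that averages temperature
# over the preceding decades records a recent abrupt warming at reduced
# amplitude. The 40-year window and synthetic series are assumptions.

years = np.arange(1000, 2000)
temp = np.zeros(years.size)
temp[-100:] = np.linspace(0.0, 1.0, 100)  # 1-degree warming over the last century

window = 40                               # assumed proxy response time, in years
kernel = np.ones(window) / window
proxy = np.convolve(temp, kernel)[:temp.size]  # causal running mean

print(f"instrumental value at the end: {temp[-1]:.2f}")   # 1.00
print(f"proxy value at the end:        {proxy[-1]:.2f}")  # ~0.80, dampened
```

The same low-pass behavior would equally dampen any comparably abrupt excursion earlier in the record, which is the symmetry the comment points out.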

Good approach, and I understand your confusion. I think you come close to recognizing the misconception, and the solution is to think carefully about the implications of the underlined passage:

"I searched a bit and found one paper by Ljungqvist using proxies extending into the instrumental record, going 2000 years back. It covers only the extratropical NH, and not all the proxies used cover the entire period, so it's not quite apples to apples, but the study seems pretty comprehensive. Yes, the proxies show the modern warming, but they fail to produce the hockey stick that you get by splicing proxies and instrumental observations."


Sure. Your best reference for this, and for anything else regarding our knowledge of Holocene climate history, is Chapter 5 of IPCC WGI AR5. The following figures will be the most relevant:
Correct, but it is also under-represented due to proxy dropout at the end of this interval.

Right; as I mentioned, the proxies don't all cover the entire period, probably for different reasons, and it will be interesting to see how they track the instrumental record in the coming decades. The "divergence problem" (aka "hide the decline") is an interesting case.

True! But counter-intuitively, not as big of a problem as you might think. I have a fondness for counter-intuitive things.

The size of that problem varies. The basic problem is that you add layers of assumptions. Without sending probes we determine the masses of distant binary stars, and even figure out what kinds of molecules they have in their atmospheres, which, when you think about it, seems like magic; it works because we can be fairly confident about the assumptions made. How solid are the assumptions needed to measure how SH temperatures differ from NH temperatures in Finnish sediments? It sounds like a fair question to ask, if climateaudit got the underlying data right.
The Earth is a single system, so events in one hemisphere will affect the other. The attribution, however, is hard, and this indirect methodology is prone to confirmation bias and ad hoc reasoning, and will easily miss the unexpected. And as I've said many times, climate science suffers from the compulsive thought that if observations are missing, there simply has to be a way to derive them some other way. I think that is working the problem from the wrong end.
You can do the maths and figure out an upper limit on the resolution; the assumptions aren't that many. But a temperature proxy calibrated against another temperature proxy? Is that "settled science"?
I have the impression that the hockey stick has been significantly downplayed by the science (or the IPCC) in recent years. Steve McIntyre apparently calls it spaghetti instead, but if that better shows the differences, it's a more honest picture. How much warmer or colder was the SH in, say, the 6th century than in the past century, and what was its variability? We don't really know for sure; that seems to be the primary answer.