Look at the following code:

var x = new Date('2016-01-29T14:00:00'); // note: no time zone offset in the string
console.log(x.getTime());

Running it in different browsers gives surprising results:

1454076000000 // Chrome | FF | Opera
1454072400000 // IE | Edge
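
The gap between the two values is telling: it is exactly one hour, which suggests IE/Edge interpreted the string as local time (assuming, as I do here, that the test machine sat in a UTC+1 zone):

console.log(1454076000000 - 1454072400000); // 3600000 ms = 1 hour
// IE/Edge apparently parsed 14:00 as LOCAL time (13:00 UTC in a UTC+1 zone),
// while the other browsers parsed it as UTC.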

MDN says:

The getTime() method returns the numeric value corresponding to the time for the specified date according to universal time.

I repeat: according to universal time!

The ECMAScript spec says:

Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC

Pretty unambiguous.
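
And that zero point is easy to verify in any console:

var epoch = new Date(0); // 0 ms after the epoch
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z" in every browser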

Let’s change the code slightly by appending a Z at the end of the string being parsed as a date:

var x = new Date('2016-01-29T14:00:00Z'); // note: explicit Z, i.e. UTC
console.log(x.getTime());

Here are the results:

1454076000000 // Chrome | FF | Opera
1454076000000 // IE | Edge

So it’s definitely NOT getTime() that is at fault; the difference must come from parsing the date string.

On parsing date strings, the ECMAScript spec says:

The String may be interpreted as a local time, a UTC time, or a time in some other time zone, depending on the contents of the String. The function first attempts to parse the format of the String according to the rules called out in Date Time String Format (15.9.1.15). If the String does not conform to that format the function may fall back to any implementation-specific heuristics or implementation-specific date formats.

where the Date Time String Format section (15.9.1.15) says:

The value of an absent time zone offset is “Z”.

I repeat: Z, which is universal time!
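
Put differently: on an engine that follows this rule, parsing the string with and without the trailing Z must produce the same timestamp. A quick check, with the results as observed above:

var withoutZ = Date.parse('2016-01-29T14:00:00');
var withZ = Date.parse('2016-01-29T14:00:00Z');
console.log(withoutZ === withZ); // true in Chrome | FF | Opera, false in IE | Edge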

If only the IE/Edge team were able to read and understand the specs…
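
In the meantime, if you need identical timestamps in every browser, one defensive approach is to bypass the string parser entirely and build the date from its components with Date.UTC. The parseAsUTC helper below is just my sketch, not a standard API:

// Parse a 'YYYY-MM-DDTHH:mm:ss' string as UTC, regardless of how
// the browser's built-in Date parser would interpret it.
function parseAsUTC(s) {
  var p = s.split(/[-T:]/).map(Number); // [year, month, day, hour, min, sec]
  return new Date(Date.UTC(p[0], p[1] - 1, p[2], p[3], p[4], p[5]));
}

console.log(parseAsUTC('2016-01-29T14:00:00').getTime());
// 1454076000000 in every browser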

HTH,
Daniel