I'm reasoning about time parsing in JavaScript, where Date objects use millisecond precision while Unix system time uses seconds.
Setting aside the precision loss in conversion: if 10^9 seconds is a gigasecond, then 10^12 milliseconds is the same interval.
So is a gigasecond also a teramillisecond?
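The arithmetic checks out: 10^9 s × 1000 ms/s = 10^12 ms, and the SI prefix for 10^12 is tera, so one gigasecond and one teramillisecond name the same duration. A small sketch of this in JavaScript (the `addGigasecond` helper is hypothetical, just for illustration):

```javascript
// JavaScript Dates count milliseconds since the Unix epoch;
// Unix time traditionally counts seconds. Converting between
// them is a factor of 1000.
const GIGASECOND_S = 1e9;                   // 10^9 s = 1 gigasecond
const GIGASECOND_MS = GIGASECOND_S * 1000;  // 10^12 ms = 1 teramillisecond

// Hypothetical helper: shift a Date forward by one gigasecond.
function addGigasecond(date) {
  return new Date(date.getTime() + GIGASECOND_MS);
}

const epoch = new Date(0);                  // 1970-01-01T00:00:00.000Z
console.log(addGigasecond(epoch).toISOString());
// → 2001-09-09T01:46:40.000Z (one gigasecond after the epoch)
```

The same number, 10^12, is one unit either way; only the prefix/base-unit pairing changes.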