How do you specify that a class property is an integer?

TypeScript is a superset of JavaScript, which doesn't have the concept of an int. JavaScript only has number, which is a double-precision floating-point value.

Generally speaking, the amount of work the compiler would have to do to enforce only whole numbers for a TypeScript int type could be massive, and in some cases it still wouldn't be possible to guarantee at compile time that only whole numbers would be assigned. That is why an int type can't reliably be added to TypeScript.
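If whole numbers matter, the check therefore has to happen at runtime. A minimal sketch, assuming an ES2015 target for Number.isInteger; assertInt is a hypothetical helper name:

function assertInt(value: number): number {
    if (!Number.isInteger(value)) {
        throw new Error(value + " is not an integer");
    }
    return value;
}

const whole = assertInt(3);     // OK
const broken = assertInt(3.5);  // throws at runtime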

When IntelliSense first appears in Visual Studio, the tooling can't yet determine what to suggest, so you get everything, including int - but once you are dealing with a value of a known type, you'll get sensible IntelliSense.

Examples

var myInt: number;
var myString: string;

myInt. // toExponential, toFixed, toPrecision, toString
myString. // charAt, charCodeAt, concat, indexOf, lastIndexOf, length and many more...

TypeScript, like JavaScript, has no integer or float type, only number. But if you want to tell the programmer that you expect an integer, you can use a type alias:

type integer = number;
type float = number;

// example:
function setInt(id: integer) {}

but this is still the number type, so a float will be accepted as well.
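For example, continuing with the setInt function above, a fractional argument compiles without complaint:

setInt(1);    // OK
setInt(1.5);  // also accepted - integer is just another name for number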

From the documentation:
"Aliasing doesn’t actually create a new type - it creates a new name to refer to that type. Aliasing a primitive is not terribly useful, though it can be used as a form of documentation."


In TypeScript you can approximate what is sometimes called an opaque type using a marker.

// Helper for generating Opaque types.
type Opaque<T, K> = T & { __opaque__: K };

// 2 opaque types created with the helper
type Int = Opaque<number, 'Int'>;
type ID = Opaque<number, 'ID'>;

// using our types to differentiate our properties even at runtime
// they are still just numbers
class Foo {
    someId: ID;
    someInt: Int;
}

let foo = new Foo();

// compiler won't let you do this due to our markers
foo.someId = 2;
foo.someInt = 1;

// when assigning, you have to cast to the specific type
// NOTE: This is not completely type safe as you can trick the compiler 
// with something like foo.someId = 1.45 as ID and it won't complain.
foo.someId = 2 as ID;
foo.someInt = 1 as Int;

// you can still consume as numbers
let sum: number = foo.someId + foo.someInt;

Doing this allows you to be more explicit in your code about what types your properties expect, and the compiler won't allow you to assign a primitive value without a cast. This doesn't produce any additional .js output, and you can still consume and use the values as whatever types they are based on. In this example I'm using numbers, but you can use the technique on strings and other types as well.

You can still trick the compiler into accepting something that isn't an Int or an ID in this example, but it should jump out at you if you were trying to write 1.45 as Int or something like that. You also have the option of creating helper functions to construct your values and provide runtime validation, as sketched below.
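A minimal sketch of such a helper, reusing the Int type from above (toInt is a hypothetical name), that validates at runtime before casting:

function toInt(value: number): Int {
    if (!Number.isInteger(value)) {
        throw new Error(value + " is not an integer");
    }
    return value as Int;
}

foo.someInt = toInt(1);      // OK
foo.someInt = toInt(1.45);   // throws at runtime instead of passing silently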

There are a number of different ways to create "marked" types. Here's a good article: https://michalzalecki.com/nominal-typing-in-typescript/


  1. I don't think there is a direct way to specify whether a number is an integer or a floating-point value (see the illustration after this list). In section 3.2.1 of the TypeScript specification we can see:

    "...The Number primitive type corresponds to the similarly named JavaScript primitive type and represents double-precision 64-bit format IEEE 754 floating point values..."

  2. I think int is a bug in Visual Studio IntelliSense. The correct type is number.
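A quick illustration of the IEEE 754 point (again assuming an ES2015 target for Number.isInteger):

// all numeric values are doubles; there is no separate integer representation
console.log(1 === 1.0);              // true
console.log(Number.isInteger(1.0));  // true - 1.0 holds an integral value
console.log(0.1 + 0.2 === 0.3);      // false - classic IEEE 754 rounding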
