badrap / valita

A typesafe validation & parsing library for TypeScript.

License: MIT


valita's Introduction

@badrap/valita

A TypeScript library for validating & parsing structured objects. The API is heavily influenced by Zod's excellent API, while the implementation aims for the impressive performance of simple-runtypes.

const vehicle = v.union(
  v.object({ type: v.literal("plane"), airline: v.string() }),
  v.object({ type: v.literal("train") }),
  v.object({ type: v.literal("automobile"), make: v.string() }),
);
vehicle.parse({ type: "bike" });
// ValitaError: invalid_literal at .type (expected "plane", "train" or "automobile")

Note

While this package is still evolving, we're currently not accepting any new feature requests or suggestions. Please use the issue tracker for bug reports and security concerns, which we highly value and welcome. Thank you for your understanding ❤️

Goals and Non-Goals

Goals

  1. Input Validation & Parsing: The fundamental goal of the library is to ensure that incoming data, which might not be from a trusted source, aligns with the predetermined format.
  2. Minimalism: Deliver a streamlined and concentrated library that offers just the essentials.
  3. Extensibility: Allow users to create their own validators and parsers that cater to specific validation scenarios.

Non-Goals

  1. Data Definition: The library is designed to validate and parse input data as it enters the program, rather than serving as an exhaustive tool for defining all types within the program after obtaining input.
  2. Extensive Built-In Formats: The library does not prioritize having a large array of built-in validation formats out of the box.
  3. Asynchronous Parsing: Asynchronous operations are outside the scope for this library.

Installation

For Node.js

npm i @badrap/valita

For Deno

deno add @badrap/valita

API Reference

This section contains an overview of all validation methods.

Primitive Types

Let's start with the basics! Like every validation library, valita supports all the primitive types like strings, numbers and booleans. For example, the v.string() primitive can be used like this to check whether an input value is a string:

import * as v from "@badrap/valita";

const t = v.string();
t.parse("Hello, World!");
// "Hello, World!"

Try to parse anything that's not a string and you get an error:

t.parse(1);
// ValitaError: invalid_type at . (expected string)

.parse(...) is typed to accept any type of input value, but it returns a properly typed value on success:

const u: unknown = "Hello, World!";

// TypeScript type checking is happy with this!
const s: string = t.parse(u);

The primitive types are:

  • v.string(): Check that the value is a string.
  • v.number(): Check that the value is a number (i.e. typeof value === "number", which includes NaN and ±Infinity).
  • v.bigint(): Check that the value is a bigint (i.e. typeof value === "bigint").
  • v.boolean(): Check that the value is a boolean.
  • v.null(): Check that the value is null.
  • v.undefined(): Check that the value is undefined.
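
Each of these is used exactly the same way as v.string() above. A quick sketch exercising the rest of them, assuming the same import as before:

v.boolean().parse(true);
// true
v.bigint().parse(123n);
// 123n
v.null().parse(null);
// null
v.number().parse(NaN);
// NaN (typeof NaN === "number")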

Literal Types

Sometimes knowing that a value is of a certain type is not enough. We can use v.literal(...) to check for actual values, like checking that a string is either "red", "green" or "blue" and not just any string.

const rgb = v.union(v.literal("red"), v.literal("green"), v.literal("blue"));

rgb.parse("green");
// "green"

rgb.parse("magenta");
// ValitaError: invalid_literal at . (expected "red", "green" or "blue")

We can also use this to check for concrete numbers, bigint literals or boolean values:

v.literal(1); // must be number 1
v.literal(1n); // must be bigint 1n
v.literal(true); // must be true

For more complex values you can use the .assert() method. Check out Custom Validators to learn more about it.

Allow Any / Forbid All

Valita doesn't contain a built-in equivalent to TypeScript's any type. However, v.unknown() is analogous to TypeScript's unknown type and can be used to accept any input value:

const u = v.unknown();

u.parse(1);
// 1

The inverse of v.unknown() is v.never(), which fails for every value. This is analogous to TypeScript's never type.

const n = v.never();

n.parse(1);
// ValitaError: invalid_type at . (expected nothing)

By themselves v.unknown() and v.never() are not terribly useful, but they become more relevant with composite types such as Object Types.
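
For example, v.unknown() can act as a placeholder for a property whose shape you don't care about. A sketch using v.object(...), which is introduced in the next section:

const envelope = v.object({
  id: v.string(),
  // Accept whatever the payload happens to be
  payload: v.unknown(),
});

envelope.parse({ id: "abc", payload: ["anything", { goes: true }] });
// { id: "abc", payload: ["anything", { goes: true }] }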

Object Types

Validators can be combined into larger, arbitrarily complex validators. One such combinator is v.object(...), used to check that the input value is an object that has some named properties, and that those properties have specific types.

const o = v.object({
  name: v.string(),

  // Nested objects work fine too!
  address: v.object({
    city: v.string(),
    country: v.string(),
  }),
});

o.parse({
  name: "Acme Inc.",
  address: { city: "Springfield", country: "Freedomland" },
});
// {
//   name: "Acme Inc.",
//   address: { city: "Springfield", country: "Freedomland" },
// }

o.parse({ name: "Acme Inc." });
// ValitaError: missing_value at .address (missing value)
o.parse({
  name: "Acme Inc.",
  ceo: "Wiley E. Coyote",
  address: { city: "Springfield", country: "Freedomland" },
});
// ValitaError: unrecognized_keys at . (unrecognized key "ceo")

As seen above, unexpected keys like "ceo" are prohibited by default. That default can be changed with Parsing Modes.

Parsing Modes

By default v.object(...) throws an error when it encounters an object with unexpected keys. That behavior can be changed by explicitly passing a parsing mode to .parse(...):

const o = v.object({
  name: v.string(),
});

// Strip away the extra keys
o.parse({ name: "Acme Inc.", ceo: "Wiley E. Coyote" }, { mode: "strip" });
// { name: "Acme Inc." }

// Pass the extra keys through as-is
o.parse({ name: "Acme Inc.", ceo: "Wiley E. Coyote" }, { mode: "passthrough" });
// { name: "Acme Inc.", ceo: "Wiley E. Coyote" }

// Forbid extra keys. This is the default.
o.parse({ name: "Acme Inc.", ceo: "Wiley E. Coyote" }, { mode: "strict" });
// ValitaError: unrecognized_keys at . (unrecognized key "ceo")

The possible values are:

  • { mode: "strict" }: Forbid extra keys. This is the default.
  • { mode: "strip" }: Don't fail on extra keys - instead strip them away from the output object.
  • { mode: "passthrough" }: Don't fail on extra keys - just pass them through to the output as-is.

The parsing mode applies to all levels of your validation hierarchy, including nested objects.

const o = v.object({
  company: v.object({
    name: v.string(),
  }),
});

o.parse(
  {
    company: { name: "Acme Inc.", ceo: "Wiley E. Coyote" },
    greeting: "Hello!",
  },
  { mode: "strip" },
);
// { company: { name: "Acme Inc." } }

Rest Properties & Records

Sometimes you may want to allow extra keys in addition to the defined keys. For that you can use .rest(...), and additionally require the extra keys to have a specific type of value:

const o = v
  .object({
    name: v.string(),
    age: v.number(),
  })
  .rest(v.string());

o.parse({ name: "Example McExampleface", age: 42, socks: "yellow" });
// { name: "Example McExampleface", age: 42, socks: "yellow" }

o.parse({ name: "Example McExampleface", age: 42, numberOfDogs: 2 });
// ValitaError: invalid_type at .numberOfDogs (expected string)

The .rest(...) method is also handy for allowing or forbidding extra keys for specific parts of your object hierarchy, regardless of the parsing mode.

const lenient = v.object({}).rest(v.unknown()); // *Always* allow extra keys
lenient.parse({ socks: "yellow" }, { mode: "strict" });
// { socks: "yellow" }

const strict = v.object({}).rest(v.never()); // *Never* allow extra keys
strict.parse({ socks: "yellow" }, { mode: "strip" });
// ValitaError: invalid_type at .socks (expected nothing)

To always allow an arbitrary set of extra properties, v.record(...) is shorthand for v.object({}).rest(...). This is analogous to the Record<string, ...> type in TypeScript.

const r = v.record(v.number());

r.parse({ a: 1, b: 2 });
// { a: 1, b: 2 }

r.parse({ a: 1, b: "hello" });
// ValitaError: invalid_type at .b (expected number)

Optional Properties

One common API pattern is that some object fields are optional, i.e. they can be missing completely or be set to undefined. You can allow some keys to be missing by annotating them with .optional().

const person = v.object({
  name: v.string(),
  // Not everyone filled in their theme song
  themeSong: v.string().optional(),
});

person.parse({ name: "Jane Doe", themeSong: "Never gonna give you up" });
// { name: "Jane Doe", themeSong: "Never gonna give you up" }
person.parse({ name: "Jane Doe" });
// { name: "Jane Doe" }
person.parse({ name: "Jane Doe", themeSong: undefined });
// { name: "Jane Doe", themeSong: undefined }

Optionals are only used with v.object(...) and don't work as standalone parsers.

const t = v.string().optional();

// TypeScript error: Property 'parse' does not exist on type 'Optional<string>'
t.parse("Hello, World!");

The .default(...) method can be used to set a default value for a missing or undefined value.

const person = v.object({
  name: v.string(),
  // Set a sensible default for those unwilling to fill in their theme song
  themeSong: v.string().default("Tribute"),
});

person.parse({ name: "Jane Doe", themeSong: "Never gonna give you up" });
// { name: "Jane Doe", themeSong: "Never gonna give you up" }
person.parse({ name: "Jane Doe" });
// { name: "Jane Doe", themeSong: "Tribute" }
person.parse({ name: "Jane Doe", themeSong: undefined });
// { name: "Jane Doe", themeSong: "Tribute" }

Array Types

The v.array(...) combinator can be used to check that the value is an array, and that its items have a specific type. The validated arrays may be of arbitrary length, including empty arrays.

const a = v.array(v.object({ name: v.string() }));

a.parse([{ name: "Acme Inc." }, { name: "Evil Corporation" }]);
// [{ name: "Acme Inc." }, { name: "Evil Corporation" }]
a.parse([]);
// []

a.parse({ 0: { name: "Acme Inc." } });
// ValitaError: invalid_type at . (expected array)
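
Array validators nest like any other validator. A small sketch of a two-dimensional array of numbers:

const matrix = v.array(v.array(v.number()));

matrix.parse([[1, 2], [3]]);
// [[1, 2], [3]]

matrix.parse([[1, "2"]]);
// ValitaError: invalid_type at .0.1 (expected number)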

Tuple Types

Despite JavaScript not having tuple values (...yet?), many APIs emulate them with arrays. For example, if we needed to encode a range between two numbers we might choose type Range = [number, number] as the data type. From JavaScript's point of view it's just an array, but TypeScript knows the type of each position and that the array must have exactly two entries.

We can express this kind of type with v.tuple(...):

const range = v.tuple([v.number(), v.number()]);

range.parse([1, 2]);
// [1, 2]
range.parse([200, 2]);
// [200, 2]

range.parse([1]);
// ValitaError: invalid_length at . (expected an array with 2 item(s))
range.parse([1, 2, 3]);
// ValitaError: invalid_length at . (expected an array with 2 item(s))
range.parse([1, "2"]);
// ValitaError: invalid_type at .1 (expected number)

Union Types

A union type allows a value to have one of several different representations. Let's imagine we have a value of type Shape that can be either a triangle, a circle or a square:

const triangle = v.object({ type: v.literal("triangle") });
const square = v.object({ type: v.literal("square") });
const circle = v.object({ type: v.literal("circle") });

const shape = v.union(triangle, square, circle);

shape.parse({ type: "triangle" });
// { type: "triangle" }

shape.parse({ type: "heptagon" });
// ValitaError: invalid_literal at .type (expected "triangle", "square" or "circle")

Note that although in this example all representations are objects and share the property type, that's not necessary at all. Each representation can have a completely different base type.

const primitive = v.union(v.number(), v.string(), v.boolean());

primitive.parse("Hello, World!");
// "Hello, World!"

primitive.parse({});
// ValitaError: invalid_type at . (expected number, string or boolean)

Nullable Type

When working with APIs or databases, some types may be nullable. For a validator t, the t.nullable() shorthand returns a validator equivalent to v.union(v.null(), t).

// type name = null | string
const name = v.string().nullable();

// Passes
name.parse("Acme Inc.");
// Passes
name.parse(null);
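
Note that nullable is not the same as optional: a nullable property must still be present, and undefined is not accepted. A sketch of the difference (the exact error wording may vary):

const company = v.object({
  // null is allowed, but the key must be present
  name: v.string().nullable(),
  // the key may be missing or undefined, but null is not allowed
  motto: v.string().optional(),
});

company.parse({ name: null });
// { name: null }

company.parse({ name: undefined });
// Throws a ValitaError: undefined is neither null nor a string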

Recursive Types

Some types can contain arbitrary nesting, like type T = string | T[]. We can express such types with .lazy(...).

Note that TypeScript cannot infer the return types of recursive functions. That's why v.lazy(...) validators need to be explicitly typed with v.Type<T>.

type T = string | T[];
const myType: v.Type<T> = v.lazy(() => v.union(v.string(), v.array(myType)));
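
With that in place, the recursive validator works like any other. A quick sketch (the exact error wording may vary):

myType.parse(["a", ["b", ["c"]], "d"]);
// ["a", ["b", ["c"]], "d"]

myType.parse([1]);
// Throws a ValitaError: 1 at .0 is neither a string nor an array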

Custom Validators

The .assert() method can be used for custom validation logic, like checking that object properties are internally consistent.

const Span = v
  .object({ start: v.number(), end: v.number() })
  .assert((obj) => obj.start <= obj.end);

Span.parse({ start: 1, end: 2 });
// { start: 1, end: 2 }

Span.parse({ start: 2, end: 1 });
// ValitaError: custom_error at . (validation failed)

You can also refine the input type by passing in a type predicate. Note that the type predicate must have a compatible input type.

function isEventHandlerName(s: string): s is `on${string}` {
  return s.startsWith("on");
}

const e = v.string().assert(isEventHandlerName);

const name: `on${string}` = e.parse("onscroll");
// "onscroll"

e.parse("Steven");
// ValitaError: custom_error at . (validation failed)

Each .assert(...) returns a new validator, so you can further refine already refined types. You can also pass in a custom failure message.

const Integer = v.number().assert((n) => Number.isInteger(n), "not an integer");

const Byte = Integer.assert((i) => i >= 0 && i <= 255, "not between 0 and 255");

Byte.parse(1);
// 1

Byte.parse(1.5);
// ValitaError: custom_error at . (not an integer)
Byte.parse(300);
// ValitaError: custom_error at . (not between 0 and 255)

Custom validators can be used like any other built-in validator. This means that you can define helpers tailored to your specific use cases and reuse them over and over.

// Reusable custom validator
const Organization = v
  .object({
    name: v.string(),
    active: v.boolean(),
  })
  .assert((org) => org.active);

// Reuse the custom validator
const Transaction = v.object({
  buyer: Organization,
  seller: Organization,
  amount: v.number(),
});

Custom Parsers

While .assert(...) can ensure that a value is valid and even refine the value's type, it can't alter the value itself. Yet sometimes we may want to validate and transform the value in one go.

The .map(...) method is great for cases where you know that the transformation can't fail. The output type doesn't have to stay the same:

const l = v.string().map((s) => s.length);

l.parse("Hello, World!");
// 13

l.parse(1);
// ValitaError: invalid_type at . (expected string)

The .chain(...) method is more powerful: it can also be used for cases where the parsing might fail. Imagine a JSON API which outputs dates in the YYYY-MM-DD format and we want to return a valid Date from our validation phase:

{
  "created_at": "2022-01-01"
}

.chain(...), much like .map(...), receives a function to which it will pass the raw value as the first argument. If the transformation fails, we return an error (with an optional message) with v.err(...). Otherwise, we return the transformed value with v.ok(...).

const DateType = v.string().chain((s) => {
  const date = new Date(s);

  if (isNaN(+date)) {
    return v.err("invalid date");
  }

  return v.ok(date);
});

const APIResponse = v.object({
  created_at: DateType,
});

APIResponse.parse({ created_at: "2022-01-01" });
// { created_at: 2022-01-01T00:00:00.000Z }

APIResponse.parse({ created_at: "YOLO" });
// ValitaError: custom_error at .created_at (invalid date)

For both .map(...) and .chain(...) we highly recommend avoiding mutating the input value. Prefer returning a new value instead.

v.object({ name: v.string() }).map((obj) => {
  // Mutating the input value like below is highly discouraged:
  //  obj.id = randomUUID();
  // Return a new value instead:
  return { ...obj, id: randomUUID() };
});

Parsing Without Throwing

The .parse(...) method used thus far throws a ValitaError when validation or parsing fails. The .try(...) method can be used when you'd rather reserve throwing for actually exceptional cases, such as coding errors. Parsing modes are also supported.

const o = v.object({ name: v.string() });

o.try({ name: "Acme Inc." });
// { ok: true, value: { name: "Acme Inc." } }
o.try({ name: "Acme Inc.", country: "Freedomland" }, { mode: "strip" });
// { ok: true, value: { name: "Acme Inc." } }

o.try({});
// { ok: false, message: "missing_value at .name (missing value)" }

The .ok property can be used to inspect the outcome in a typesafe way.

// Fail about 50% of the time
const r = o.try(Math.random() < 0.5 ? { name: "Acme Inc." } : {});

if (r.ok) {
  // r.value is defined within this block
  console.log(`Success: ${r.value}`);
} else {
  // r.message is defined within this block
  console.log(`Failure: ${r.message}`);
}

To allow further composition, .try(...)'s return values are compatible with .chain(...). The chained function also receives a second parameter that contains the parsing mode, which can be passed forward to .try(...).

const Company = v.object({ name: v.string() });

const CompanyString = v.string().chain((json, options) => {
  let value: unknown;
  try {
    value = JSON.parse(json);
  } catch {
    return v.err("not valid JSON");
  }
  return Company.try(value, options);
});

CompanyString.parse('{ "name": "Acme Inc." }');
// { name: "Acme Inc." }

CompanyString.parse('{ "name": "Acme Inc.", "ceo": "Wiley E. Coyote" }');
// ValitaError: unrecognized_keys at . (unrecognized key "ceo")

// The parsing mode is forwarded to .try(...)
CompanyString.parse('{ "name": "Acme Inc.", "ceo": "Wiley E. Coyote" }', {
  mode: "strip",
});
// { name: 'Acme Inc.' }

Inferring Output Types

The exact output type of a validator can be inferred with v.Infer<typeof ...>:

const Person = v.object({
  name: v.string(),
  age: v.number().optional(),
});

type Person = v.Infer<typeof Person>;
// type Person = { name: string, age?: number };
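
The inferred type can then be used like any hand-written type, for example in a function signature:

function greet(person: Person): string {
  return `Hello, ${person.name}!`;
}

greet(Person.parse({ name: "Jane Doe" }));
// "Hello, Jane Doe!"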

Type Composition Tips & Tricks

Reduce, Reuse, Recycle

The API surface of this library is intentionally kept small - for some definition of small. As such, we encourage curating a library of helpers tailored to your specific needs. For example, a reusable helper for ensuring that a number falls within a specific range could be defined and used like this:

function between(min: number, max: number) {
  return (n: number) => {
    if (n < min || n > max) {
      return v.err("outside range");
    }
    return v.ok(n);
  };
}

const num = v.number().chain(between(0, 255));
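
Used this way, out-of-range values fail with the helper's custom error message:

num.parse(42);
// 42

num.parse(300);
// ValitaError: custom_error at . (outside range)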

Type Inference & Generics

Every standalone validator fits the type v.Type<Output>, Output being the validator's output type. TypeScript's generics and type inference can be used to define helpers that take in validators and do something with them. For example, a readonly(...) helper that marks the output type as readonly (non-recursively) could be defined and used as follows:

function readonly<T>(t: v.Type<T>): v.Type<Readonly<T>> {
  return t as v.Type<Readonly<T>>;
}

const User = readonly(v.object({ id: v.string() }));
type User = v.Infer<typeof User>;
// type User = { readonly id: string; }

Deconstructed Helpers

Some validator types offer additional properties and methods for introspecting and transforming them further. One such case is v.object(...)'s .shape property that contains the validators for each property.

const Company = v.object({
  name: v.string().assert((s) => s.length > 0, "empty name"),
});
Company.shape.name.parse("Acme Inc.");
// "Acme Inc."

However, because .assert(...), .map(...) and .chain(...) may all restrict and transform the output type almost arbitrarily, their returned validators may not have the properties or methods specific to the original ones. For example a refined v.object(...) validator will not have the .shape property. Therefore the following will not work:

const Company = v
  .object({
    name: v.string().assert((s) => s.length > 0, "empty name"),
    employees: v.number(),
  })
  .assert((c) => c.employees >= 0);

const Organization = v.object({
  // Try to reuse Company's handy name validator
  name: Company.shape.name,
});
// TypeScript error: Property 'shape' does not exist on type 'Type<{ name: string; }>'

The recommended solution is to deconstruct the original validators enough so that the common pieces can be directly reused:

const NonEmptyString = v.string().assert((s) => s.length > 0, "empty");

const Company = v
  .object({
    name: NonEmptyString,
    employees: v.number(),
  })
  .assert((c) => c.employees >= 0);

const Organization = v.object({
  name: NonEmptyString,
});

License

This library is licensed under the MIT license. See LICENSE.

valita's People

Contributors

dimatakoy, github-actions[bot], jviide, marvinhagemeister, reinismu, rslabbert, townsheriff


valita's Issues

Runtime schema inspection / type narrowing

I've noticed that the API is reaaaally defensive. I expect this is an intentional choice, but it makes some things quite difficult. The workaround mentioned in #26 is okay (in contrast with extending Type) for creating new types, but I just hit a problem that I don't think I can get around without access to the actual ObjectType class.

What I'm trying to do is validate "as much of the data as I currently have" against a subset of the schema I've constructed. ObjectType includes useful class functions (partial, omit, pick, etc...) -- but since I'm pulling the subset from a generic, I can't access them. I also can't use instanceof to narrow the type so that TypeScript will allow me to call .partial on the schema.

Concrete details follow, but you needn't read further if you agree that it's useful to be able to write:

declare const unknownSchema: unknown;
if (unknownSchema instanceof v.ObjectType) {
  unknownSchema.partial();
}

The setup this time is a hierarchical config schema, where the top-level keys are the config necessary for some component, for example:

const schema = v.object({
  common: v.object({
    configFile: v.string().default('config.json')
  }),
  db: v.object({ ... }),
  web: v.object({ ... }),
})

I'm using the reasonably common pattern of a default config file location, with the ability to specify an alternate config file to use via environment variables / cli args. The path to the config file is in common as shown above. So, the application logic is something like this:

  • load CLI and env vars into the config store
  • load content from the config file into the config store
  • etc...

At the stage where I'm loading the file, I want to retrieve the contents of only the common key, and force all the values to be optional, so that validation does not fail if some are missing. However, at final config validation, they should not be optional, and the final validated config should be as tight as possible.

I'm doing this essentially by:

schema['shape']['common'].parse(configdata['common'])

... which guarantees me the shape and types of any values that do exist, but requires me to write extra-defensive code. This is fine for this stage, but not desirable all the time once the app is fully bootstrapped.

Hopefully you can agree that this is a worthwhile use-case for exposing the instances (not just the types) of some of the basic tools here (if not the AbstractType class, or the Type class, then at least the leaf nodes -- ObjectType, ArrayType, etc.). Thoughts?

(p.s. sorry for the sudden influx of issues. This library is very useful, which is why I've spent all day attempting a migration to see if it will suit :) it is very close...)

`Pick` and `Omit`

This could be a nice addition to objects. There are cases when we need to take only a part of the object.

const TestObject = v.object({
    id: v.string(),
    someValue: v.string().assert(...)
});

const input = TestObject.pick(['someValue']);

What are your thoughts on this?

TS4094, issueTree is private

Hi there. I'm trying this library out and quite pleased with it -- particularly the flexibility to define parsing and validation behavior. This is hampered somewhat by a couple things, which I'm filing as separate issues to keep them discrete. The first of them is:

export const foo = (val: string) => {
  if (val === 'oh noes') {
    return v.err('sad');
  }
  return v.ok(val);
};

Since the return value of v.err includes private members, TypeScript fails when doing something like this.

Example usage:

import { foo } from './common.js';

const MyObj = v.object({
  foo: v.string().chain(foo)
});

It's okay if you define it in the same file as your schema, but you can't export the helper function. I'm not sure if other things (e.g. CustomError) would also need to be exported.

v.lazy() causes error

Example:

try {
    type P = {p?: P};
    const P: v.Type<P> = v.object({
        p: v.lazy(() => P).optional(),
    })
    
    P.parse({p: {}});
} catch (e) {
    console.error(e);
}

This causes a RangeError: Maximum call stack size exceeded

It gets stuck in an infinite recursion, calling genFunc() on itself.

Annotation support?

One more:

I have a function that accepts a schema in the library I'm migrating away from and integrates it with yargs. To do that, it examines the expected input type of the schema. I see that Valita does expose a name property on its schema objects, but that property winds up being useless if you use anything but the bare types exposed by valita (chain, assert, optional, etc...).

It would be nice to be able to associate some user-supplied context with a type, or at least give .chain an argument to set the name of a type. Preferably, the orthogonal concerns of "what it is" and "how to treat it" (optional string) are both exposed, and while I'm not eager to tack something like .annotate({type: 'string', optional: true}) everywhere, it does make a simple building block with which more robust behavior can be built.

In general, introspection behavior would be helpful (the ability to traverse and understand the input requirements of a type at runtime)

Issue with generics

One thing I would like to do is create a generic parsing method for network requests. It would take in a shape and return its Infer. Currently it doesn't fully work.

const fetchAndValidate = async <A extends Type>(url: string, responseShape: A): Promise<Infer<A>> => {
    const result = await fetch(url);
    return responseShape.parse(await result.json()); // Type 'unknown' is not assignable to type 'SomethingOutputOf<A>'
}

Am I doing something wrong, or is there something off with the types?

Help developer catch error early

I noticed that it's too easy to make an error when working with .check(...). What about a runtime warning or something similar when a developer tries to chain a schema with .check(...)?

or... maybe disallow this chaining, like you already do with the .shape prop in the case of .assert()?

Helper methods

Zod and MyZod have many helper methods .int() .max(10) .pattern() .safeParse() etc.

How do you look at these?

Currently it is possible to solve this all with .assert()
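
For illustration, a hedged sketch of how such helpers can be layered on top of .assert() - the int and max names here are hypothetical, not part of valita's API:

const int = () => v.number().assert(Number.isInteger, "expected an integer");
const max = (limit: number) => (t: v.Type<number>) =>
  t.assert((n) => n <= limit, `expected at most ${limit}`);

max(10)(int()).parse(7);
// 7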

TS4023 for exported type instance

In my TSConfig I set declaration to true, and then I started getting TS4023 Errors for almost every exported type.

It appears that there is some issue with the fact that ObjectType is not an export, and so typescript cannot create declarations for instances that are defined with it.

import * as v from '@badrap/valita'

export const T = v.object({a: v.string()}); // TS4023

This is my tsconfig.json:

{
    "compilerOptions": {
        "target": "esnext",
        "moduleResolution": "node",
        "outDir": "dist",
        "strict": true,
        "noImplicitAny": true,
        "declaration": true,
    },
    "include": ["src/**/*"]
}

Tuple type

I'm very excited about this library. It has speed and everything I need except tuple :/

Any plans to add it?

Thanks!

Can t.Infer be made to _not_ unroll deep schemas?

This is partly a question, and partly a feature request.

Usually when TypeScript sees a type that matches something you've given a name, it displays that thing's type as the name. Sometimes that's not what I want, and I have to work around it, like so:

type User = {
  name: string,
  id: number,
}

type Post = {
  user: User,
  date: Date,
}

const post: Post = {
  user: { name: 'myndzi', id: 1 },
  date: new Date()
}

type Simplify<T extends {}> = {[K in keyof T]: T[K]} & unknown;

type simplified = Simplify<typeof post>;
// type simplified = {
//     user: User;
//     date: Date;
// }

In Valita, this seems to be the default (only?) behavior of t.Infer:

import * as v from '@badrap/valita';

const User = v.object({
  name: v.string(),
  id: v.number(),
});

const Post = v.object({
  user: User,
  date: v.string()
});

type post = v.Infer<typeof Post>;
// type post = {
//   user: {
//     name: string;
//     id: number;
//   };
//   date: string;
// };

Unfortunately, this means that if you include even a single field that is moderately complex, the editor unrolls the full thing, making the "high level view" of the type in question useless -- since it eliminates most of the direct keys in favor of showing the full deep structure instead (the contents of user above).

It would be nice if t.Infer did not do this, but I'm not entirely sure where the behavior is coming from. Is Valita explicitly using something like the Simplify trick above? If so, would you be open to not doing that? (It's easy to turn a "collapsed" object into an "expanded" object, but not the other way around).

An alternative might be to export AbstractType and allow usage something like:

type User = v.Infer<typeof User>;

const Post = v.object({
  user: User as AbstractType<User>,
  date: v.string()
});

... or else, something with chain, assert, etc. that lets the user specify a type alias for a complex type.

The desired behavior using the above (trivial) example would be for the type of Post to be shown as:

type post = {
  user: User,
  date: string
}

Improved error reporting for .map()

Report refers to this diff (screenshot omitted):

The old version used .map(...) to convert a byte array to a custom type. If validation fails, an Error is thrown.

The new version uses .chain(...).

With map, the error message on validation failure looked like this:

Error: Could not validate EdToNd message: Error: Expected key to be 32 bytes but has 0 bytes

With chain, it looks like this:

Error: Could not validate EdToNd message: ValitaError: custom_error at .essentialData.groups.0.group.profilePicture.updated.blob.key (Expected 32-byte blob key, but found 0 bytes) (+ 2 other issues)

The error message for chain is much better, because it shows the source of the validation error in a complex object. With map, only the exception message itself is reported.

Is there a technical reason why this is the case? Or could exceptions thrown by map functions be caught and handled in a way similar to chain?

Question: How to combine records and objects

I'm not sure if this can be done with the existing APIs but I want to express something like the following TypeScript type:

type T = {
  x: number;
  [key: string]: string | number;
}

The best I could come up with is:

const schema = valita
  .record(valita.union(valita.string(), valita.number()))
  .chain(v =>
    valita.object({x: valita.number()}).try(v, {mode: 'passthrough'}),
  );

Sendable Types?

Is there a way to convert a Valita type instance into a sendable type?

e.g.

const T = v.union(v.string(), v.number());
const sendT = T.sendable();

// this represents sending the type
const sentT = JSON.parse(JSON.stringify(sendT));

const U = v.fromSendable(sentT)
const u = U.parse(4);

Input & output types

Hi there,

We really wanted to use your lib over Zod (for performance reasons), but this is the only feature that we miss here on Valita:

const stringToNumber = z.string().transform((val) => val.length);

type input = z.input<typeof stringToNumber>; // string
type output = z.output<typeof stringToNumber>; // number

Is there any way to achieve this with Valita, or is this feature planned?

Regards

Passing ParseOptions to chain?

It is nice how chain and try work together and let you create nice combinations:

import * as v from '@badrap/valita';
const s1 = v.object({
  x: v.number(),
});
const s2 = v.object({
  y: v.number(),
});
const s3 = s1.chain(value => s2.try(value));
console.log(s3.parse({x: 1, y: 2})); // throws
console.log(s3.parse({x: 1, y: 2}, {mode: 'passthrough'})); // also throws

I think that chain should take the options so that the last line above could be made to work.
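
Given that the README's "Parsing Without Throwing" section above documents .chain(...) receiving the parsing options as a second argument, a sketch of what that could look like (assuming a version with that feature):

const s3 = s1.chain((value, options) => s2.try(value, options));

console.log(s3.parse({x: 1, y: 2}, {mode: 'passthrough'}));
// { x: 1, y: 2 }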

RFE: Add a way to infer readonly types

It would be nice to have a way to mark types (array/tuples/objects/records) as readonly.

const schemaX = valita.object({
   x: valita.number(),
});
type T = valita.Infer<typeof schemaX>;
// {x: number}

I can use the TS Readonly type but it gets inconvenient when you have nested types.

type ReadonlyT = Readonly<valita.Infer<typeof schemaX>>;
// {readonly x: number}

const schemaY = valita.object({
   y: schemaX,
});
type T2 = valita.Infer<typeof schemaY>;
// {y: {x: number}}

type ReadonlyT2 = Readonly<valita.Infer<typeof schemaY>>;
// {readonly y: {x: number}}

I know I can use DeepReadonly type combinators but those tend to fail pretty fast for me because my types are too deep.

I guess one problem with adding a way to flag things as readonly is that it is not clear if we also need to have a runtime check for readonly-ness?

Undefined error message in Firefox

For some reason the error message in Firefox is undefined instead of showing the path to the key where the validation error occurred. This only happens when the error is logged to console. If we inspect the error object manually we can see that all properties are present as they should be.

Reproduction: https://i6nu3.csb.app/


Validating typed arrays

I use a lot of Uint8Array values in my objects. Is it possible to validate those?

Here's a reproducer:

import * as v from "@badrap/valita";

const BytesHolder = v.object({
  fourBytes: v.array(v.number()).assert((val) => val.length === 4)
});

console.log('Primitive array');
BytesHolder.parse({
    fourBytes: [1, 2, 3, 4],
});

console.log('Typed array');
BytesHolder.parse({
    fourBytes: new Uint8Array([1, 2, 3, 4]),
});

The second call fails.

How to express intersection types?

A quick vanilla example:

type T1<F> = {field: F}
type T2 = {label: 'stringy'}
type X = T1<string> & T2

I want to do this in valita... I expected this to work:

import {Type, Infer, object, string, literal} from '@badrap/valita';
const T1 = <FT extends Type>(F: FT) => object({field: F});
const T1s = T1(string());
const T2 = object({label: literal('stringy')});
const X = T1s.chain(T2.parse);
type X = Infer<typeof T1s> & Infer<typeof T2>;

(Note: in this example the X type is correctly inferred)

but tsc doesn't accept .chain(T2.parse). Maybe there should be a .isect(T2) that just accepts the type?

External fields (for objects and tuples)

This is a feature request. I have some code that looks like this:

import * as v from '@badrap/valita';

// setup
const externalSymbol = Symbol('external') as any;
const isExternal = (v: unknown) => v === externalSymbol;
const external = <T>(t: v.Type<T>): v.Type<T> =>
    t.default(externalSymbol)
    .assert(isExternal, 'field must be empty');

// child type includes fields that are constructed by another parser
// so they must not get values from the input, but must be expected on the type
type Child = v.Infer<typeof Child>;
const Child = v.object({
    name: external(v.string()),
    color: v.string(),
});

// parent includes the mapping that assigns children their name
type Parent = v.Infer<typeof Parent>
const Parent = v.object({children: v.record(Child)})
    .map(parent => {
        const children = Object.entries(parent.children);
        for (const [childname, child] of children) {
            child.name = childname;
        }
        return parent;
    });

// results
// after parsing, example's children know their own names
const okExample = Parent.parse({
    children: {
        john: {color: 'mauve'},
        frank: {color: 'red'},
    }
});
console.log(okExample.children.john.name);

// will not parse, child defines external field
const badExample = Parent.parse({
    children: {
        john: {color: 'mauve', name: 'lucas'},
        frank: {color: 'red'},
    }
});

Parent is responsible for completing the construction of Child (because it has the necessary information). To facilitate this, Child provides type information for the field but also marks it as external; this way, if the input tries to provide a value, parsing fails.

The feature request is to add the type modifier .external() to types to express this use case as follows:

type Child = v.Infer<typeof Child>;
const Child = v.object({
    name: v.string().external(),
    color: v.string(),
});

Bonus points if forgetting to assign them creates a parse error later!

Recursive Types Cannot be Defined

I might have a Typescript type like this:

type T = string | T[]

But if I try to define this type in valita I cannot pass the identifier to its own instantiator:

const T = v.union(v.string(), v.array(T)); // error: identifier used before it is defined

I mulled over this a bit, but I can't think of a smart way to implement it that doesn't involve somehow modifying the union after it is instantiated, and then having to trick typescript into giving it a different type...

Is there any way to get all issues and not only the first one?

    const testData = {};

    const testSchema = v.object({
        test: v.string(),
        test2: v.string()
    });

    testSchema.parse(testData);

The error only speaks of one problem, but actually two fields are missing:

ValitaError: missing_key at . (missing key "test")

Deno support

Hi guys, is there any interest in supporting Deno?

I am already using valita with Deno and it is completely compatible. The only issue is that I have to use a bundled version and import it from a CDN like https://cdn.jsdelivr.net/gh/badrap/valita@d5d0125c467fb307addb9cc2badd098bfc66b834/src/index.ts .

Maybe you could implement webhooks pointing to the index.ts file, which would automatically publish new releases on https://deno.land/x. You can find more about it on Deno's website.

Thanks!

Value validation

Is it possible to do validations on the value, which is not part of the type information (e.g. only allowing numbers 0-255, or only strings matching a RegEx)? Or is that out of scope?

There seems to be code to validate the length of arrays, but I think right now this is only used for tuples, which are covered by TypeScript's type system.
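
For reference, the Custom Validators section above shows this pattern: value-level checks can be layered on top of the type checks with .assert(...). A quick sketch:

const byte = v.number().assert((n) => n >= 0 && n <= 255, "not between 0 and 255");
const hex = v.string().assert((s) => /^[0-9a-f]+$/i.test(s), "not a hex string");

byte.parse(300);
// ValitaError: custom_error at . (not between 0 and 255)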

The `protoless` optimization breaks Firebase/Firestore

The following optimization to use {__proto__: null} as the [[Prototype]] of cloned objects

valita/src/index.ts, lines 568 to 578 at commit 5db630e:

// When an object matcher needs to create a copied version of the input,
// it initializes the new objects with Object.create(protoless).
//
// Using Object.create(protoless) instead of just {} makes setting
// "__proto__" key safe. Previously we set object properties with a helper
// function that special-cased "__proto__". Now we can just do
// `output[key] = value` directly.
//
// Using Object.create(protoless) instead of Object.create(null) seems to
// be faster on V8 at the time of writing this (2023-08-07).
const protoless = Object.freeze(Object.create(null));

causes Firebase/Firestore to fail its check for "plain objects".

https://github.com/googleapis/nodejs-firestore/blob/d2b97c4e041ca6f3245b942540e793d429f8e5c5/dev/src/util.ts#L107C1-L114C2

Could we remove the protoless optimization and use null as the [[Prototype]]?

// Using Object.create(protoless) instead of Object.create(null) seems to
// be faster on V8 at the time of writing this (2023-08-07).

Do you have a perf test for this? Is there a V8 tracking bug?

Readonly fields on inferred type

Is there a way to get fields in the inferred type to be marked as readonly?

I'm thinking of something like this:

export const SCHEMA = v.object({
    regular_field: v.string(),
    optional_field: v.string().optional(),
    readonly_field: v.string().readonly(),
});
export type InferredType = v.Infer<typeof SCHEMA>;

..resulting in:

type InferredType = {
    optional_field?: string | undefined;
    regular_field: string;
    readonly readonly_field: string;
}

export type classes

After writing some mini libraries, I realized that we need to export the classes so we don't have to use type assertions on every line.

// import hierarchy that I suggest
import * as v from '@badrap/valita' // like right now
import * as valitaLib from '@badrap/valita/lib' // export classes for library authors here

Exporting AbstractType

I currently have some HTTP request functions with a signature similar to this:

private async _request<TSchema extends v.ObjectType>(
    requestPath: string,
    schema: TSchema,
): Promise<v.Infer<TSchema>> {

This is really nice, because I can do generic validation of a JSON response body. However, it only works with object types, not with union types.

The type that v.Infer requires is T extends AbstractType. However, I cannot write that into my function signature because AbstractType isn't exported. Would it be possible to export AbstractType?

Extending objects

Zod has a .extend() method on most types.

I can use v.object({ ...BaseType.shape, <write my stuff> })

It works fine in all cases except when I have v.object({}).assert(...). Then I can't get the shape out of it. I guess even if I could get it, I would lose my assert. Any ideas on how I could improve this?

Should we add `intersect` method to UnionType?

I found multiple cases where I do

 const a = v.union(v.literal("started"));
 const b = v.union(v.literal("completed"));

 const schema = v.union(...a.options, ...b.options);

So, can we add intersect (or maybe a better name) to UnionType, like we did for ObjectType before (.extend)?
