google-research / bert-base-uncased

A language model that can fill masked tokens in sentences

Usage

Inputs

Name: input

Description: The sentences to fill `[MASK]` tokens in.

Type: string

Shape: [N]

Name: max_tokens

Description: The maximum number of tokens to predict for each mask. Optional, defaults to 1.

Type: uint32

Shape: Scalar
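Assembled as a batch, the inputs above might look like the following. This is a minimal sketch; the sentence values are illustrative, and the key names come from the spec above.

```python
# A batch of N = 2 sentences, each containing one [MASK] token.
# "input" has shape [N]; "max_tokens" is a scalar uint32 that is
# optional and defaults to 1 when omitted.
inputs = {
    "input": [
        "Paris is the [MASK] of France.",
        "The quick brown [MASK] jumps over the lazy dog.",
    ],
    "max_tokens": 2,
}
```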

Outputs

Name: tokens

Description: The predicted tokens for each input sentence. This will have shape `[N, max_tokens]`, but some cells may be empty.

Type: string

Shape: [N, *]

Name: scores

Description: The scores for each predicted token. This will have shape `[N, max_tokens]`, but some cells may have a score of zero.

Type: float32

Shape: [N, *]
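Because both outputs are padded out to `[N, max_tokens]`, a caller typically drops the padded cells before using the results. A minimal post-processing sketch, with illustrative values standing in for real model output (assuming padded token cells are empty strings and padded score cells are `0.0`, per the descriptions above):

```python
# Illustrative [N, max_tokens] outputs for N = 2, max_tokens = 2.
# The second sentence only produced one prediction, so its second
# cell is padding (empty token, zero score).
tokens = [["capital", "city"], ["fox", ""]]
scores = [[0.98, 0.01], [0.95, 0.0]]

def unpad(tokens, scores):
    """Drop padded cells, returning a list of (token, score) pairs per sentence."""
    return [
        [(t, s) for t, s in zip(row_t, row_s) if t != ""]
        for row_t, row_s in zip(tokens, scores)
    ]

predictions = unpad(tokens, scores)
```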

Readme

This model can predict masked tokens in a sentence. For example, given `Paris is the [MASK] of France.`, it might predict `capital`.

See the project homepage for more details.

Get it

Copy the model URL:

https://carton.pub/google-research/bert-base-uncased/5f26d87c5d82b7c37ebf92fcb38788a063d49a64cfcf1f9d118b3b710bb88005

This model has quickstart code available for every supported language.

Homepage

https://github.com/google-research/bert

License

Apache-2.0
