*, defaults to 77):
    Input sequence length for text encoder.
text_encoder_num_layers (`int`, *optional*, defaults to 6):
    Number of layers for the transformer in the text encoder.
text_encoder_vocab_size (`int`, *optional*, defaults to 49408):
    Vocabulary size for the tokenizer.
text_encoder_proj_layers (`int`, *optional*, defaults to 2):
    Number of layers in the MLP used to project text queries.
text_encoder_n_ctx (`int`, *optional*, defaults to 16):
    Number of learnable text context queries.
conv_dim (`int`, *optional*, defaults to 256):
    Feature map dimension to which outputs from the backbone are mapped.
mask_dim (`int`, *optional*, defaults to 256):
    Dimension for feature maps in the pixel decoder.
hidden_dim (`int`, *optional*, defaults to 256):
    Dimension for hidden states in the transformer decoder.
encoder_feedforward_dim (`int`, *optional*, defaults to 1024):
    Dimension for the FFN layer in the pixel decoder.
norm (`str`, *optional*, defaults to `GN`):
    Type of normalization.
encoder_layers (`int`, *optional*, defaults to 6):
    Number of layers in the pixel decoder.
decoder_layers (`int`, *optional*, defaults to 10):
    Number of layers in the transformer decoder.
use_task_norm (`bool`, *optional*, defaults to `True`):
    Whether to normalize the task token.
num_attention_heads (`int`, *optional*, defaults to 8):
    Number of attention heads in the transformer layers of the pixel and transformer decoders.
dropout (`float`, *optional*, defaults to 0.1):
    Dropout probability for the pixel and transformer decoders.
dim_feedforward (`int`, *optional*, defaults to 2048):
    Dimension for the FFN layer in the transformer decoder.
pre_norm (`bool`, *optional*, defaults to `False`):
    Whether to normalize hidden states before the attention layers in the transformer decoder.
enforce_input_proj (`bool`, *optional*, defaults to `False`):
    Whether to project hidden states in the transformer decoder.
query_dec_layers (`int`, *optional*, defaults to 2):
    Number of layers in the query transformer.
common_stride (`int`, *optional*, defaults to 4):
    Common stride used for features in the pixel decoder.
Examples:

```python
>>> from transformers import OneFormerConfig, OneFormerModel

>>> # Initializing a OneFormer shi-labs/oneformer_ade20k_swin_tiny configuration
>>> configuration = OneFormerConfig()

>>> # Initializing a model (with random weights) from the shi-labs/oneformer_ade20k_swin_tiny style configuration
>>> model = OneFormerModel(configuration)

>>> # Accessing the model configuration
>>> configuration = model.config
```