AttributeError: 'SinusoidalEmbeddings' object has no attribute 'apply_rotary_pos_emb'

#1
by Sumail - opened

Hey, Marcus! I followed your demo and pip-installed the environment as you provided, but when I ran the code it showed the error below. What can I do to deploy your model?

/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: resume_download is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use force_download=True.
warnings.warn(
loading saved model at version 1.2.17, but current package version is 1.5.4
90%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‰ | 1344/1500 [00:11<00:01, 113.50it/s]

AttributeError Traceback (most recent call last)
Cell In[1], line 13
10 transformer = MeshTransformer.from_pretrained("MarcusLoren/MeshGPT-preview",cache_dir=cache_dir).to(device)
12 output = []
---> 13 output.append((transformer.generate(texts = ['sofa','bed', 'computer screen', 'bench', 'chair', 'table' ] , temperature = 0.0) ))
14 output.append((transformer.generate(texts = ['milk carton', 'door', 'shovel', 'heart', 'trash can', 'ladder'], temperature = 0.0) ))
15 output.append((transformer.generate(texts = ['hammer', 'pedestal', 'pickaxe', 'wooden cross', 'coffee bean', 'crowbar'], temperature = 0.0) ))

File /usr/local/lib/python3.10/dist-packages/x_transformers/autoregressive_wrapper.py:29, in eval_decorator.<locals>.inner(self, *args, **kwargs)
27 was_training = self.training
28 self.eval()
---> 29 out = fn(self, *args, **kwargs)
30 self.train(was_training)
31 return out

File /usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py:115, in context_decorator.<locals>.decorate_context(*args, **kwargs)
112 @functools.wraps(func)
113 def decorate_context(*args, **kwargs):
114 with ctx_factory():
--> 115 return func(*args, **kwargs)

File /usr/local/lib/python3.10/dist-packages/meshgpt_pytorch/meshgpt_pytorch.py:1410, in MeshTransformer.generate(self, prompt, batch_size, filter_logits_fn, filter_kwargs, temperature, return_codes, texts, text_embeds, cond_scale, cache_kv, max_seq_len, face_coords_to_file)
1407 return codes
1409 self.autoencoder.eval()
-> 1410 face_coords, face_mask = self.autoencoder.decode_from_codes_to_faces(codes)
1412 if not exists(face_coords_to_file):
1413 return face_coords, face_mask

File /usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py:115, in context_decorator.<locals>.decorate_context(*args, **kwargs)
112 @functools.wraps(func)
113 def decorate_context(*args, **kwargs):
114 with ctx_factory():
--> 115 return func(*args, **kwargs)

File /usr/local/lib/python3.10/dist-packages/meshgpt_pytorch/meshgpt_pytorch.py:910, in MeshAutoencoder.decode_from_codes_to_faces(self, codes, face_mask, return_discrete_codes)
907 quantized = self.quantizer.get_output_from_indices(codes)
908 quantized = rearrange(quantized, 'b (nf nvf) d -> b nf (nvf d)', nvf = self.num_vertices_per_face)
--> 910 decoded = self.decode(
911 quantized,
912 face_mask = face_mask
913 )
915 decoded = decoded.masked_fill(~face_mask[..., None], 0.)
916 pred_face_coords = self.to_coor_logits(decoded)

File /usr/local/lib/python3.10/dist-packages/meshgpt_pytorch/meshgpt_pytorch.py:876, in MeshAutoencoder.decode(self, quantized, face_mask)
873 if exists(linear_attn):
874 x = linear_attn(x, mask = face_mask) + x
--> 876 x = attn(x, mask = face_mask) + x
877 x = ff(x) + x
879 x = rearrange(x, 'b n d -> b d n')

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1518, in Module._wrapped_call_impl(self, *args, **kwargs)
1516 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
1517 else:
-> 1518 return self._call_impl(*args, **kwargs)

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1527, in Module._call_impl(self, *args, **kwargs)
1522 # If we don't have any hooks, we want to skip the rest of the logic in
1523 # this function, and just call forward.
1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1525 or _global_backward_pre_hooks or _global_backward_hooks
1526 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527 return forward_call(*args, **kwargs)
1529 try:
1530 result = None

File /usr/local/lib/python3.10/dist-packages/local_attention/transformer.py:152, in LocalMHA.forward(self, x, mask, attn_bias, cache, return_cache)
149 out = einsum(attn, v, 'b h i j, b h j d -> b h i d')
151 else:
--> 152 out = self.attn_fn(q, k, v, mask = mask, attn_bias = attn_bias)
154 if return_cache:
155 kv = torch.stack((k, v))

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1518, in Module._wrapped_call_impl(self, *args, **kwargs)
1516 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
1517 else:
-> 1518 return self._call_impl(*args, **kwargs)

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1527, in Module._call_impl(self, *args, **kwargs)
1522 # If we don't have any hooks, we want to skip the rest of the logic in
1523 # this function, and just call forward.
1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1525 or _global_backward_pre_hooks or _global_backward_hooks
1526 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527 return forward_call(*args, **kwargs)
1529 try:
1530 result = None

File /usr/local/lib/python3.10/dist-packages/local_attention/local_attention.py:162, in LocalAttention.forward(self, q, k, v, mask, input_mask, attn_bias, window_size)
160 if exists(self.rel_pos):
161 pos_emb, xpos_scale = self.rel_pos(bk)
--> 162 bq, bk = self.rel_pos.apply_rotary_pos_emb(bq, bk, pos_emb, scale = xpos_scale)
164 # calculate positions for masking
166 bq_t = b_t

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1695, in Module.__getattr__(self, name)
1693 if name in modules:
1694 return modules[name]
-> 1695 raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")

AttributeError: 'SinusoidalEmbeddings' object has no attribute 'apply_rotary_pos_emb'

Hi,

There was a library that was updated and became incompatible with MeshGPT.
The issue with the library has since been resolved, so try reinstalling it:

pip uninstall meshgpt_pytorch -y
pip install git+https://github.com/MarcusLoppe/meshgpt-pytorch.git
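After reinstalling, it's worth double-checking what actually ended up in the environment before rerunning the demo. A minimal check using only the standard library (package names as pip reports them):

from importlib.metadata import version, PackageNotFoundError

# Print the installed version of each package involved in the error,
# so a stale install is caught before the long generation run.
for pkg in ("meshgpt-pytorch", "local-attention", "rotary-embedding-torch"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")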

I did as you instructed, but it doesn't seem to have made a difference.
The issue still remains, so weird.. dude

loading saved model at version 1.2.17, but current package version is 1.5.7
77%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‹ | 1158/1500 [00:10<00:03, 110.12it/s]

....

File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1695, in Module.__getattr__(self, name)
1693 if name in modules:
1694 return modules[name]
-> 1695 raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")

AttributeError: 'SinusoidalEmbeddings' object has no attribute 'apply_rotary_pos_emb'

Hmm, maybe try --upgrade; that should ensure that the dependencies are reinstalled.

pip install --upgrade git+https://github.com/MarcusLoppe/meshgpt-pytorch.git
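(Side note: pip's default upgrade strategy is only-if-needed, so dependencies that already satisfy the requirement are left untouched. If the plain --upgrade doesn't help, the eager strategy upgrades the dependencies as well:)

pip install --upgrade --upgrade-strategy eager git+https://github.com/MarcusLoppe/meshgpt-pytorch.git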

I have the same problem with the demo, has this been resolved?

Unfortunately, this doesn't seem to have resolved it.

I've updated the setup.py so it includes the required version of rotary_embedding_torch.
The issue is due to you having an old version of rotary_embedding_torch.

Run this command and hopefully it will force an install of the latest version of rotary_embedding_torch:
pip install --upgrade --force-reinstall git+https://github.com/MarcusLoppe/meshgpt-pytorch.git

If that doesn't resolve it, can you run this and let me know what version you have?
pip show rotary_embedding_torch

Still not working.

Name: rotary-embedding-torch
Version: 0.6.4
Summary: Rotary Embedding - Pytorch
Home-page: https://github.com/lucidrains/rotary-embedding-torch
Author: Phil Wang
Author-email: [email protected]
License: MIT
Location: C:\Users\newky\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages
Requires: einops, torch
Required-by: gateloop-transformer, meshgpt-pytorch, taylor-series-linear-attention

Hi again,

I was able to replicate your issue, and it seems like local-attention was at fault :/
Give this a go, and I'll raise the issue with the package owner:

pip install --upgrade --force-reinstall local-attention==1.9.1

If this doesn't resolve the issue, can you run:
pip show local_attention
pip show taylor_series_linear_attention
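Also worth ruling out a mismatch between the pip that installs and the Python that runs the demo (the Location in your pip show output points at a Windows Store Python 3.12). Running pip through the interpreter itself and printing the imported module's path makes the comparison explicit:

python -m pip show local-attention
python -c "import local_attention; print(local_attention.__file__)"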

Hey,

Let me know if it's been resolved so I can close this thread.
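If you want to check quickly without rerunning the whole demo, a tiny forward pass through LocalAttention exercises the same rotary code path that raised the AttributeError. This is just a sketch; the constructor arguments follow the local-attention README, not the MeshGPT demo:

import torch
from local_attention import LocalAttention

# passing dim enables the sinusoidal/rotary positional path that failed above
attn = LocalAttention(dim = 64, window_size = 512, causal = True)

q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)
v = torch.randn(1, 8, 1024, 64)

out = attn(q, k, v)  # raised AttributeError on the broken version
print(out.shape)     # torch.Size([1, 8, 1024, 64]) once fixed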

It works, thanks

Sumail changed discussion status to closed
