`ModuleNotFoundError: No module named 'torch.nn.attention'`, raised from a path like `python3.10/site-packages/unsloth/kernels/flex_attention.py`, means the installed PyTorch predates the `torch.nn.attention` namespace that newer libraries such as Unsloth import. The related `AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'` points the same way: `F.scaled_dot_product_attention` was added in PyTorch 2.0, so any older build lacks it. That includes, for example, a PyTorch built from source against Python 3.5 ("I have installed pytorch on py35 from source"), which is far too old for any of these APIs. `MultiheadAttention()`, by contrast, is a class in PyTorch's `torch.nn` module and is available on every modern release. `ModuleNotFoundError: No module named 'ultralytics.nn.attention'` is a different Python error: it means that no module named `ultralytics.nn.attention` can be found in your environment, so it concerns the Ultralytics package rather than PyTorch itself and is typically resolved by installing or upgrading `ultralytics`.

The newest of these APIs is FlexAttention, whose documented signature is:

torch.nn.attention.flex_attention.flex_attention(query, key, value, score_mod=None, block_mask=None, scale=None, enable_gqa=False, return_lse=False, kernel_options=None)

so importing it requires a PyTorch recent enough to ship the module (it first appeared in the 2.5 release line). Finally, `from torch.jit import script, trace` together with `import torch.nn as nn` is the usual prelude for TorchScript, whose C++ runtime can execute these `torch.nn` modules once they are scripted or traced.
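Because every one of these failures reduces to "the installed PyTorch is older than the API being called", a quick capability probe is the most useful first step. The sketch below is illustrative, not any library's API; the helper name `sdpa_compat` is invented here. It prefers the fused kernel when `hasattr` finds it and otherwise falls back to plain softmax attention:

```python
import math
import torch
import torch.nn.functional as F

print(torch.__version__)  # SDPA needs >= 2.0; torch.nn.attention.flex_attention needs >= 2.5

def sdpa_compat(q, k, v, attn_mask=None):
    # Hypothetical shim: use the fused kernel on PyTorch >= 2.0, otherwise
    # compute softmax(QK^T / sqrt(d)) V by hand.
    if hasattr(F, "scaled_dot_product_attention"):
        return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if attn_mask is not None:
        scores = scores + attn_mask  # additive mask; use -inf to block positions
    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(2, 4, 16, 8)  # (batch, heads, seq_len, head_dim)
print(sdpa_compat(q, k, v).shape)     # torch.Size([2, 4, 16, 8])
```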
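For the `flex_attention` call itself, here is a minimal sketch assuming PyTorch 2.5+ (anything older fails at the import, which is exactly the error above). The `score_mod` callback receives each raw attention score plus its batch, head, query, and key indices; the causal variant below follows the example pattern in the PyTorch docs:

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

def causal(score, b, h, q_idx, kv_idx):
    # Keep the score where the query may attend; send future positions to -inf.
    return torch.where(q_idx >= kv_idx, score, -float("inf"))

# flex_attention expects 4-D (batch, heads, seq_len, head_dim) tensors.
q = torch.randn(2, 4, 128, 64)
k = torch.randn(2, 4, 128, 64)
v = torch.randn(2, 4, 128, 64)

out = flex_attention(q, k, v, score_mod=causal)
print(out.shape)  # torch.Size([2, 4, 128, 64])
```

Eager execution works but is slow; for real workloads the documentation recommends compiling the call with `torch.compile`. The remaining parameters follow the signature above: `block_mask` is the sparse counterpart of `score_mod`, and `return_lse=True` additionally returns the logsumexp of the attention scores.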
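The fragmentary note about the C++ implementation and the `torch.jit` imports read together as TorchScript usage; on that reading, here is a small sketch. `nn.MultiheadAttention` needs no version-gated import, and a traced module can later be loaded by the C++ runtime (libtorch). The sizes are arbitrary:

```python
import torch
import torch.nn as nn
from torch.jit import script, trace  # script() is the alternative when data-dependent control flow must be kept

mha = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
x = torch.randn(2, 10, 64)        # (batch, seq_len, embed_dim)
out, attn_weights = mha(x, x, x)  # self-attention: query = key = value
print(out.shape)                  # torch.Size([2, 10, 64])

# trace() records the ops for this input shape; the saved file can be
# loaded from C++ via libtorch's torch::jit::load.
traced = trace(mha, (x, x, x))
traced.save("mha_traced.pt")
```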