
Multimodal Large Model Study Notes (14) — Transformer: Self-Attention

  • 2026-03-14 14:50:30

Self-Attention is the core engine of the Transformer architecture. It addresses two pain points of RNN-style models — the difficulty of modeling long-range dependencies and poor parallelism — by letting the model capture the semantic association between any two tokens in a sequence simultaneously.

This article works through Scaled Dot-Product Attention, the Mask mechanism, and Multi-Head Attention from the ground up, pairing each with its corrected mathematical formulation and a runnable implementation, to build a thorough understanding of how Self-Attention works.


1. Background: Q, K, V — Intuitive Metaphor and Technical Nature

Before diving into the technical details, first understand the roles of Q, K, and V — the key to mastering Self-Attention.

1.1 An Intuitive Metaphor (map / coordinates / valuables)

| Concept | Metaphor | Role |
| --- | --- | --- |
| Query (Q) | A rough map | What the current token is *looking for* — the "demand vector" that initiates the lookup |
| Key (K) | A precise set of coordinates | What each token in the sequence *offers* — the "feature vector" being looked up |
| Value (V) | The valuables stored at that location | Each token's actual semantic content — the information ultimately retrieved |

1.2 Technical Nature

In a Transformer, Q, K, and V are not given a priori. They are obtained by passing the input embeddings (token embedding + positional encoding) through three independent, learnable linear projections:

Q = X·W_Q,  K = X·W_K,  V = X·W_V

where:

  • X ∈ ℝ^(B×L×D_model): the input embeddings (B is the batch size, L the sequence length, D_model the model's hidden dimension);
  • W_Q, W_K, W_V ∈ ℝ^(D_model×D_k): learnable projection matrices (D_k is the dimension of a single attention head).
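The three projections above can be sketched in a few lines of PyTorch. This is a minimal illustration — the sizes, the `bias=False` choice, and the variable names are assumptions for demonstration, not a prescribed configuration:

```python
import torch
import torch.nn as nn

# Illustrative sizes: batch B=2, sequence L=5, D_model=64, per-head D_k=64
B, L, d_model, d_k = 2, 5, 64, 64
x = torch.randn(B, L, d_model)  # stands in for token embeddings + positional encoding

# Three independent learnable projections (bias omitted for simplicity)
w_q = nn.Linear(d_model, d_k, bias=False)
w_k = nn.Linear(d_model, d_k, bias=False)
w_v = nn.Linear(d_model, d_k, bias=False)

q, k, v = w_q(x), w_k(x), w_v(x)
print(q.shape, k.shape, v.shape)  # each [2, 5, 64]
```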

2. Scaled Dot-Product Attention: The Basic Unit of Self-Attention

Scaled Dot-Product Attention is the smallest executable unit of Self-Attention.

2.1 Core Flow

Each step, in execution order:

  1. MatMul (Q×Kᵀ): compute the similarity between Q and every K (the attention scores), measuring how strongly the current token relates to the other tokens in the sequence;
  2. Scale: divide by √D_k to counter the vanishing gradients caused by large dot products in high dimensions;
  3. Mask (optional): set invalid positions (padding slots, or future tokens in generative tasks) to negative infinity so the model cannot attend to them;
  4. SoftMax: normalize the scores into a 0–1 probability distribution that sums to 1 — the "attention weight" placed on each token;
  5. MatMul (weights×V): take the weighted sum of V under the normalized weights, yielding a representation of the current token fused with global semantics.

2.2 Mathematical Formulation

(1) Core formula

The full expression for Scaled Dot-Product Attention is:

Attention(Q, K, V) = SoftMax(QKᵀ / √D_k + M) · V

Parameter dimensions:

  • Q ∈ ℝ^(B×L_q×D_k): query matrix (L_q is the query sequence length);
  • K ∈ ℝ^(B×L_k×D_k): key matrix (L_k is the key sequence length; in Self-Attention, L_q = L_k);
  • V ∈ ℝ^(B×L_k×D_v): value matrix (typically D_v = D_k);
  • M ∈ ℝ^(B×L_q×L_k): Mask matrix (−∞ at invalid positions, 0 at valid ones);
  • Output ∈ ℝ^(B×L_q×D_v): the query sequence representation fused with global semantics.

(2) Why "scaled"?

When D_k is large, the variance of the dot product q·k grows linearly with D_k, pushing the SoftMax output to extremes near 0 or 1 (and its gradients toward zero). Dividing by √D_k normalizes the variance back to 1 and keeps gradients stable: if the components of q and k are i.i.d. with zero mean and unit variance, then Var(q·k) = D_k, so Var(q·k / √D_k) = 1.
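The variance claim is easy to check empirically. A quick sketch (the sample size and D_k are arbitrary choices for illustration):

```python
import torch

torch.manual_seed(0)
d_k = 512

# 10,000 pairs of vectors with i.i.d. zero-mean, unit-variance components
q = torch.randn(10_000, d_k)
k = torch.randn(10_000, d_k)

raw = (q * k).sum(dim=-1)     # unscaled dot products
scaled = raw / d_k ** 0.5     # divided by sqrt(d_k)

print(raw.var().item())       # ≈ d_k, i.e. ≈ 512
print(scaled.var().item())    # ≈ 1
```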

(3) Two kinds of Mask

  1. Padding Mask: for batches of unequal-length sequences, masks the padded positions — M[i, j] = −∞ if key position j is padding, else 0;
  2. Look-ahead Mask: for generative tasks (e.g. GPT), masks every position after the current token — M[i, j] = −∞ if j > i, else 0.
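Both mask types can be constructed in a few lines. A sketch, using the convention 1 = masked / 0 = valid (the validity pattern is an arbitrary example):

```python
import torch

seq_len = 5

# Padding Mask: suppose the last 2 positions of a sequence are padding.
valid = torch.tensor([1, 1, 1, 0, 0])  # 1 = real token, 0 = pad
# Broadcast the pad flags over the query axis: every query masks pad keys.
padding_mask = (valid == 0).float().expand(seq_len, seq_len)  # [L, L]

# Look-ahead Mask: the strict upper triangle masks future positions (j > i).
look_ahead_mask = torch.triu(torch.ones(seq_len, seq_len), diagonal=1)

print(padding_mask)
print(look_ahead_mask)
```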

2.3 Code Implementation (PyTorch)

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(
    q: torch.Tensor,
    k: torch.Tensor,
    v: torch.Tensor,
    mask: torch.Tensor = None
) -> tuple[torch.Tensor, torch.Tensor]:
    """
    Scaled Dot-Product Attention, matching the formula above.
    Args:
        q: [batch_size, seq_len_q, d_k]  query matrix
        k: [batch_size, seq_len_k, d_k]  key matrix
        v: [batch_size, seq_len_k, d_v]  value matrix
        mask: [batch_size, seq_len_q, seq_len_k]  mask (optional, 1 = masked)
    Returns:
        output: [batch_size, seq_len_q, d_v]  attention output
        attn_weights: [batch_size, seq_len_q, seq_len_k]  attention weights
    """
    # 1. Q×K^T (the QK^T in the formula)
    d_k = q.size(-1)
    attn_scores = torch.matmul(q, k.transpose(-2, -1))  # [B, L_q, L_k]
    # 2. Scale (the /√D_k in the formula)
    attn_scores = attn_scores / torch.sqrt(torch.tensor(d_k, dtype=torch.float32))
    # 3. Apply the mask (the +M in the formula)
    if mask is not None:
        attn_scores = attn_scores.masked_fill(mask == 1, -1e9)  # masked slots → -∞
    # 4. SoftMax normalization
    attn_weights = F.softmax(attn_scores, dim=-1)  # [B, L_q, L_k]
    # 5. Weights×V
    output = torch.matmul(attn_weights, v)  # [B, L_q, D_v]
    return output, attn_weights

# Quick test
if __name__ == "__main__":
    # Dummy input: B=2, L=5, D_k=64
    batch_size, seq_len, d_k = 2, 5, 64
    q = torch.randn(batch_size, seq_len, d_k)
    k = torch.randn(batch_size, seq_len, d_k)
    v = torch.randn(batch_size, seq_len, d_k)
    # Padding mask: the last 2 tokens of the 2nd sample are padding
    mask = torch.zeros(batch_size, seq_len, seq_len)
    mask[1, :, 3:] = 1  # [2, 5, 5]
    # Run the attention computation
    output, attn_weights = scaled_dot_product_attention(q, k, v, mask)
    print(f"Q/K/V shape: {q.shape}")
    print(f"Attention weight shape: {attn_weights.shape}")  # [2, 5, 5]
    print(f"Attention output shape: {output.shape}")        # [2, 5, 64]
```

3. Multi-Head Attention

Multi-Head Attention is the upgraded form of Scaled Dot-Product Attention, addressing the limitation that a single attention head cannot capture multiple dimensions of semantics.

3.1 Core Logic

The core idea: split Q, K, and V into h independent "attention heads", let each head learn a different dimension of semantic association, then concatenate the heads and apply a final linear projection to fuse their information.

Execution steps:

  1. Linear projection: pass Q, K, V through independent linear layers;
  2. Split heads: split the projected Q, K, V along the D_model dimension into h heads of size D_k each;
  3. Per-head attention: each head independently runs Scaled Dot-Product Attention;
  4. Concatenate: join the h heads' outputs back along the last dimension;
  5. Final projection: fuse the multi-head semantic information into the final result.

3.2 Mathematical Formulation

(1) Head splitting and projection

Let the model's hidden dimension be D_model and the number of heads be h. Each head then has dimension D_k = D_model / h (h must divide D_model evenly).

(2) Per-head attention and concatenation

head_i = Attention(Q·W_Q^i, K·W_K^i, V·W_V^i),  i = 1, …, h
MultiHead(Q, K, V) = Concat(head_1, …, head_h) · W_O

  • head_i ∈ ℝ^(B×L×D_k): the attention output of the i-th head;
  • Concat: concatenation along the last dimension (joining h blocks of size D_k back into D_model);
  • W_O ∈ ℝ^(D_model×D_model): the final projection matrix that fuses the heads' semantic information.

3.3 Code Implementation (PyTorch)

```python
import torch
import torch.nn as nn
from typing import Optional

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        """
        Multi-Head Attention, matching the formulas above.
        Args:
            d_model: total model dimension (e.g. 768); must satisfy d_model % num_heads == 0
            num_heads: number of attention heads (e.g. 12)
        """
        super().__init__()
        self.d_model = d_model
        self.num_heads = num_heads
        self.d_k = d_model // num_heads  # per-head dimension (D_k in the formula)
        # 1. Projection layers (W_Q / W_K / W_V in the formula)
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        # 5. Final projection layer (W_O in the formula)
        self.w_o = nn.Linear(d_model, d_model)

    def _split_heads(self, x: torch.Tensor) -> torch.Tensor:
        """
        Split the projected tensor into heads (the splitting step).
        In:  x [B, L, D_model]  →  Out: [B, num_heads, L, D_k]
        """
        batch_size, seq_len, _ = x.shape
        # reshape: [B, L, num_heads, D_k] → transpose: [B, num_heads, L, D_k]
        return x.view(batch_size, seq_len, self.num_heads, self.d_k).transpose(1, 2)

    def _concat_heads(self, x: torch.Tensor) -> torch.Tensor:
        """
        Concatenate the head outputs (the Concat step).
        In:  x [B, num_heads, L, D_k]  →  Out: [B, L, D_model]
        """
        batch_size, _, seq_len, _ = x.shape
        # transpose: [B, L, num_heads, D_k] → merge: [B, L, D_model]
        return x.transpose(1, 2).contiguous().view(batch_size, seq_len, self.d_model)

    def forward(
        self,
        q: torch.Tensor,
        k: torch.Tensor,
        v: torch.Tensor,
        mask: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        """
        Forward pass, matching the formulas above.
        Args:
            q/k/v: [B, L, D_model]  input matrices
            mask:  [B, L, L]        mask (optional, 1 = masked)
        Returns:
            output: [B, L, D_model]  multi-head attention output
        """
        # Step 1: linear projections (Q·W_Q etc.)
        q_proj = self.w_q(q)  # [B, L, D_model]
        k_proj = self.w_k(k)  # [B, L, D_model]
        v_proj = self.w_v(v)  # [B, L, D_model]
        # Step 2: split into heads (Q_i etc.)
        q_heads = self._split_heads(q_proj)  # [B, h, L, D_k]
        k_heads = self._split_heads(k_proj)  # [B, h, L, D_k]
        v_heads = self._split_heads(v_proj)  # [B, h, L, D_k]
        # Step 3: per-head attention (head_i); reuses
        # scaled_dot_product_attention from Section 2.3.
        # Expand the mask to match the head axis: [B, L, L] → [B, 1, L, L]
        mask_expanded = mask.unsqueeze(1) if mask is not None else None
        attn_output, _ = scaled_dot_product_attention(q_heads, k_heads, v_heads, mask_expanded)
        # attn_output: [B, h, L, D_k]
        # Step 4: concatenate the heads (Concat)
        attn_concat = self._concat_heads(attn_output)  # [B, L, D_model]
        # Step 5: final linear projection (·W_O)
        output = self.w_o(attn_concat)  # [B, L, D_model]
        return output

# Quick test
if __name__ == "__main__":
    # D_model=768, h=12 (BERT-base configuration)
    mha = MultiHeadAttention(d_model=768, num_heads=12)
    # Dummy input: B=2, L=10, D_model=768 (in Self-Attention, Q=K=V)
    batch_size, seq_len, d_model = 2, 10, 768
    x = torch.randn(batch_size, seq_len, d_model)
    # Look-ahead mask (generative setting)
    look_ahead_mask = torch.triu(torch.ones(seq_len, seq_len), diagonal=1)  # [10, 10]
    look_ahead_mask = look_ahead_mask.unsqueeze(0).repeat(batch_size, 1, 1)  # [2, 10, 10]
    # Run multi-head attention
    output = mha(x, x, x, mask=look_ahead_mask)
    print(f"Input shape: {x.shape}")
    print(f"Multi-head attention output shape: {output.shape}")  # [2, 10, 768], same as input
```

4. Self-Attention vs. Ordinary Attention: The Key Difference

Self-Attention is a special case of the Attention mechanism; its core formula is:

SelfAttention(X) = Attention(X·W_Q, X·W_K, X·W_V)

where Q, K, and V are all projected from the same input X. Compared with ordinary Attention:

  • Ordinary Attention (e.g. the Encoder-Decoder Attention in machine translation): Q comes from the Decoder while K and V come from the Encoder, aligning the target sequence with the source sequence;
  • Self-Attention: Q, K, and V all come from the same sequence (e.g. the Encoder's input), modeling the semantic associations among the tokens within one sequence.

This is why Self-Attention captures long-range context in text so efficiently: it computes the attention weights between any two tokens in the sequence simultaneously, with no need for the word-by-word traversal of an RNN.
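The distinction is visible directly in PyTorch's built-in `nn.MultiheadAttention`: the same module computes self- or cross-attention depending only on where Q, K, and V come from. A sketch with illustrative shapes:

```python
import torch
import torch.nn as nn

mha = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)

enc = torch.randn(2, 7, 64)  # encoder output, source length 7
dec = torch.randn(2, 5, 64)  # decoder states, target length 5

# Self-Attention: Q, K, V all come from the same sequence
self_out, _ = mha(enc, enc, enc)   # [2, 7, 64]

# Encoder-Decoder (cross) Attention: Q from the decoder, K/V from the encoder
cross_out, _ = mha(dec, enc, enc)  # [2, 5, 64]

print(self_out.shape, cross_out.shape)
```

Note that the output length always follows the query: cross-attention returns one vector per decoder position, each a weighted mix of encoder values.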


5. Summary

  1. Basic unit: Scaled Dot-Product Attention fuses global semantics into each token via "Q×Kᵀ similarity → scaling → Mask → SoftMax normalization → weighted sum over V";
  2. Upgraded form: Multi-Head Attention captures multi-dimensional semantic associations via "split heads → independent attention → concatenate → linear projection", and is the core of the Transformer;
  3. Key advantages: highly parallel computation and strong long-range dependency modeling — the foundation for large models processing text, images, and other sequence data.

