Refactor the stock monitoring system: database architecture upgrade and feature completion

- Refactor the data access layer: introduce the DAO pattern with support for both MySQL and SQLite
- Add database schema: complete table structures for stock data, AI analysis, and watchlist management
- Upgrade the AI analysis service: integrate the Doubao large language model with multi-dimensional analysis
- Optimize API routing: split out the market data API for a cleaner interface design
- Improve project documentation: add a database migration guide, new-features guide, and more
- Clean up redundant files: remove old cache files and unused configuration
- Add a scheduler: support scheduled tasks and automatic data updates
- Improve front-end templates: simplified stock display page

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
parent 50989ce82d
commit 569c1c8813
.claude/settings.local.json (Normal file, 34 lines)
@@ -0,0 +1,34 @@
{
  "permissions": {
    "allow": [
      "Bash(docker-compose:*)",
      "Bash(docker run:*)",
      "Bash(docker rmi:*)",
      "WebFetch(domain:github.com)",
      "mcp__mcp-all-in-one__mcp-all-in-one-validate-mcp-config",
      "Bash(mysql:*)",
      "mcp__mcp-all-in-one__mcp-all-in-one-show-mcp-config",
      "mcp__mcp-all-in-one__mcp-all-in-one-set-mcp-config",
      "Bash(python:*)",
      "Bash(curl:*)",
      "Bash(tasklist:*)",
      "Bash(findstr:*)",
      "Bash(dir:*)",
      "mcp__chrome-devtools__take_snapshot",
      "mcp__chrome-devtools__list_console_messages",
      "mcp__chrome-devtools__navigate_page",
      "Bash(taskkill:*)",
      "Bash(netstat:*)",
      "mcp__chrome-devtools__new_page",
      "mcp__chrome-devtools__list_pages",
      "Bash(timeout:*)",
      "Bash(ping:*)",
      "mcp__chrome-devtools__click",
      "Bash(rm:*)",
      "Bash(mkdir:*)",
      "Bash(mv:*)"
    ],
    "deny": [],
    "ask": []
  }
}
.dockerignore (Normal file, 10 lines)
@@ -0,0 +1,10 @@
venv/
.git/
.gitignore
README.md
docs/
.idea/
*.png
项目文档.txt
stock_cache.json
watchlist.json
Binary image file changed (not shown); previous size: 64 KiB.
CLAUDE.md (Normal file, 155 lines)
@@ -0,0 +1,155 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Common Commands

### Development Environment
```bash
# Install dependencies
pip install -r requirements.txt

# Start the development server
python run.py

# Initialize the database (first use)
python init_database.py

# Data migration (from JSON files to the database)
python migrate_to_database.py
```

### Production Deployment
```bash
# Build and run with Docker
docker-compose up -d

# Check running status
docker-compose ps

# View logs
docker-compose logs stock-monitor
```

### Testing and Debugging
```bash
# View the API docs
# Visit http://localhost:8000/docs

# Test the database connection
python -c "from app.dao import StockDAO; print('Database connection OK')"

# Test the Tushare API
python -c "import tushare as ts; ts.set_token('your_token'); print(ts.pro_api().trade_cal(exchange='SSE', start_date='20240101', end_date='20240105'))"
```

## Core Architecture

### Tech Stack
- **Backend**: FastAPI + SQLAlchemy + MySQL/SQLite
- **Frontend**: HTML5 + CSS3 + JavaScript + Bootstrap + ECharts
- **Data sources**: Tushare API (stock data) + Doubao LLM (AI analysis)
- **Deployment**: Docker + Uvicorn

### Project Structure
```
stock-monitor/
├── app/                              # Application root
│   ├── api/                          # API routing layer
│   │   ├── stock_routes.py           # Stock-related API
│   │   └── market_routes.py          # Market data API
│   ├── services/                     # Business logic layer
│   │   ├── stock_service_db.py       # Stock data service
│   │   ├── ai_analysis_service_db.py # AI analysis service
│   │   ├── market_data_service.py    # Market data service
│   │   └── kline_service.py          # K-line data service
│   ├── dao/                          # Data access layer (DAO pattern)
│   │   ├── base_dao.py               # Base DAO class
│   │   ├── stock_dao.py              # Stock data DAO
│   │   ├── watchlist_dao.py          # Watchlist DAO
│   │   ├── ai_analysis_dao.py        # AI analysis DAO
│   │   └── config_dao.py             # Configuration DAO
│   ├── models/                       # Data models
│   └── templates/                    # HTML templates
├── config.template.py                # Configuration template
├── database_schema.sql               # MySQL database schema
├── init_database.py                  # Database initialization script
└── run.py                            # Application entry point
```

### Data Architecture
- **MySQL production database**: core tables such as stocks, stock_data, watchlist, and ai_analysis
- **SQLite development database**: used for local development and testing
- **JSON cache**: historical data is cached in JSON files
- **DAO pattern**: a unified data access interface that makes switching databases easy (see the sketch below)
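As a rough illustration of that idea only (a sketch under assumptions, not the project's actual implementation), a connection manager can pick SQLite or MySQL behind one interface while DAO classes stay unchanged; the driver, database path, and table names below are assumptions.

```python
"""Minimal sketch of the dual-database idea; module, table, and column names are illustrative."""
import sqlite3
from contextlib import contextmanager
from typing import Optional


class SketchDatabaseManager:
    """Opens SQLite for development or MySQL for production, behind one interface."""

    def __init__(self, use_mysql: bool = False, mysql_params: Optional[dict] = None):
        self.use_mysql = use_mysql
        self.mysql_params = mysql_params or {}

    @contextmanager
    def get_connection(self):
        if self.use_mysql:
            import pymysql  # assumed driver; the project may use a different one
            conn = pymysql.connect(**self.mysql_params)
        else:
            conn = sqlite3.connect("data/stocks.db")  # assumed SQLite path
        try:
            yield conn
        finally:
            conn.close()


class SketchStockDAO:
    """Callers only see DAO methods, so swapping databases touches just the manager."""

    def __init__(self, db: SketchDatabaseManager):
        self.db = db

    def get_stock_by_code(self, stock_code: str):
        # MySQL drivers use %s placeholders, sqlite3 uses ?; hide that detail here.
        placeholder = "%s" if self.db.use_mysql else "?"
        with self.db.get_connection() as conn:
            cursor = conn.cursor()
            cursor.execute(f"SELECT * FROM stocks WHERE stock_code = {placeholder}", (stock_code,))
            return cursor.fetchone()
```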

### Core Feature Modules

#### 1. Stock data management (app/services/stock_service_db.py)
- Fetches real-time stock data from the Tushare API
- Data caching (reads from the database first; see the sketch below)
- Supports forced refresh and incremental updates
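A minimal sketch of the cache-first read described above, assuming a DAO with `get_stock_data`/`save_stock_data` methods and a `fetch_from_tushare` helper; these names are illustrative, not the service's actual API.

```python
from datetime import date


def get_stock_info(dao, fetch_from_tushare, stock_code: str, force_refresh: bool = False) -> dict:
    """Return today's data for one stock, reading the database before calling Tushare."""
    today = date.today().strftime("%Y-%m-%d")

    if not force_refresh:
        cached = dao.get_stock_data(stock_code, today)  # assumed DAO method
        if cached:
            cached["from_cache"] = True
            return cached

    # Cache miss or forced refresh: hit the remote API, then persist for later reads.
    fresh = fetch_from_tushare(stock_code)              # assumed helper
    dao.save_stock_data(stock_code, fresh, today)
    fresh["from_cache"] = False
    return fresh
```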

#### 2. AI analysis (app/services/ai_analysis_service_db.py)
- Integrates the Doubao LLM for stock analysis
- Supports multiple analysis dimensions: value investing, a Tao Te Ching perspective, master-investor analysis
- Persists analysis results

#### 3. Watchlist management (app/services/watchlist_service.py)
- Add/remove watched stocks
- Set market-value alert ranges
- Target market-value monitoring

#### 4. Market data service (app/services/market_data_service.py)
- Real-time quotes for the Shanghai and Shenzhen indices
- K-line chart data
- Sector gain/loss rankings

### API Design
- RESTful API design
- Auto-generated API docs (`/docs`)
- Unified error handling and response format
- CORS support for cross-origin requests

### Configuration Management
- **Tushare token**: must be configured in config.py
- **Doubao LLM API**: requires an API key and a model endpoint ID
- **Database configuration**: supports switching between MySQL and SQLite
- **Environment variables**: configuration can be overridden via environment variables (see the sketch below)
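A hedged sketch of how environment-variable overrides could work; the variable names (`STOCK_MONITOR_DB_URL`, `TUSHARE_TOKEN`, `REFRESH_INTERVAL`) and defaults are assumptions, not the project's documented settings.

```python
import os
from dataclasses import dataclass


@dataclass
class SketchConfig:
    """Defaults live in code; environment variables win when present."""
    database_url: str = "sqlite:///data/stocks.db"   # assumed default path
    tushare_token: str = ""
    refresh_interval_seconds: int = 60

    @classmethod
    def from_env(cls) -> "SketchConfig":
        return cls(
            database_url=os.getenv("STOCK_MONITOR_DB_URL", cls.database_url),
            tushare_token=os.getenv("TUSHARE_TOKEN", cls.tushare_token),
            refresh_interval_seconds=int(os.getenv("REFRESH_INTERVAL", cls.refresh_interval_seconds)),
        )


config = SketchConfig.from_env()
```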

## Important Notes

### Data-fetch limits
- The Tushare API is rate-limited; keep request frequency moderate (a simple throttle sketch follows this list)
- Free accounts are limited to at most 200 calls per minute
- A paid plan is recommended for production
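As one way to stay under such a per-minute quota (a sketch, not a built-in mechanism of this project), a sliding-window throttle can block before each Tushare call:

```python
import time
from collections import deque


class SketchRateLimiter:
    """Allow at most `max_calls` within any rolling `period` seconds."""

    def __init__(self, max_calls: int = 200, period: float = 60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent calls

    def wait(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have left the rolling window.
        while self.calls and now - self.calls[0] > self.period:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call in the window expires.
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())


limiter = SketchRateLimiter()
# limiter.wait()  # call before each Tushare request
```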

### AI analysis cost
- The Doubao LLM is billed by usage and includes a free quota
- Beyond the free quota, you pay for actual usage
- Keep the number of AI analyses reasonable

### Data disclaimer
- Stock data provided by this project is for reference only and does not constitute investment advice
- Stock investment carries risk; be cautious when entering the market
- AI analysis results are computed from public data and models

### Development tips
- On first use, run `python init_database.py` to initialize the database
- Use SQLite during development; MySQL is recommended for production
- Keep the number of monitored stocks under 30 to maintain performance
- Use a refresh interval of at least 60 seconds

## FAQ

### Q: How do I get a Tushare token?
A: Register at https://tushare.pro and obtain an API token.

### Q: How do I configure the Doubao LLM API?
A: Create an application in the Volcengine console and obtain an API key and a model endpoint ID.

### Q: How do I switch databases?
A: Change the database connection settings in config.py and re-run init_database.py.

### Q: Port conflict when deploying with Docker?
A: Change the port mapping in docker-compose.yml, e.g. change "15348:8000" to another port.
README.md (348 lines changed)
@@ -1,165 +1,271 @@

# AI Value-Investing Watch System
# 📈 Intelligent Stock Monitoring System

A Python-based intelligent A-share monitoring and AI analysis system with multi-dimensional investment analysis, including traditional value-investing analysis, Tao Te Ching wisdom analysis, and perspectives of well-known investors.
A real-time stock monitoring and analysis platform built on FastAPI and AI, providing quote display, intelligent analysis, and watchlist management.

## Interface Preview



## 🚀 Core Features
## ✨ Features

### 1. Stock monitoring
- Real-time stock data monitoring
- Custom target market-value settings
- Target market-value alerts (overvalued, undervalued, fair)
- Multi-dimensional indicators (value-investing metrics such as PE, PB, ROE)
### 🎯 Core capabilities
- **Real-time stock monitoring**: real-time quotes for the A-share market
- **Intelligent AI analysis**: integrates the Doubao LLM for professional stock analysis and investment suggestions
- **Watchlist management**: add/remove watched stocks and set market-value alert ranges
- **Index quotes**: real-time quotes for the Shanghai and Shenzhen indices
- **Multi-dimensional analysis**: value investing, a Tao Te Ching perspective, master-investor analysis

### 2. Index quotes
- Real-time quotes for major indices
### 📊 Data sources
- **Tushare API**: a professional financial data interface with accurate Chinese stock market data
- **Real-time quotes**: key indicators such as live price, volume, and change percentage
- **Historical data**: historical K-line data for technical analysis

### 🤖 AI analysis
- Integrates the Doubao LLM (Volces API)
- Automatically generates stock analysis reports
- Provides investment suggestions and risk assessment
- Supports multiple analysis dimensions:
  - **Traditional value-investing analysis**: financial indicators, valuation, risk assessment
  - **Tao Te Ching wisdom analysis**: corporate ethics assessment, sustainability analysis
  - **Master-investor perspectives**: analysis in the style of Buffett, Graham, Lin Yuan, and others

### 3. AI analysis
- Basic value-investing analysis
  - Financial indicator analysis
  - Valuation analysis
  - Risk assessment
  - Investment suggestions
- Tao Te Ching analysis perspective
  - Corporate ethics assessment
  - Sustainability analysis
  - Long-term investment value judgment
- Analysis from well-known value investors at home and abroad
  - Buffett perspective
  - Graham perspective
  - Lin Yuan perspective
  - Li Daxiao perspective
  - Duan Yongping perspective
## 🏗️ Technical Architecture

### 4. Data management
- Local data caching
- Historical data queries
- Analysis report export
- Automatic data updates
### Backend stack
- **FastAPI**: a modern, fast Python web framework
- **SQLAlchemy**: ORM database operations
- **SQLite**: lightweight database storage
- **Tushare**: financial data retrieval
- **OpenAI SDK**: interface to the AI large model

## 🛠️ Tech Stack
### Frontend stack
- **HTML5 + CSS3**: modern interface design
- **JavaScript**: interaction logic
- **Bootstrap**: responsive UI components
- **ECharts**: data visualization charts

### Backend
## 🚀 Quick Start

### Requirements
- Python 3.8+
- FastAPI: high-performance web framework
- Tushare: financial data API
- Doubao LLM: AI analysis engine
- SQLite: local data storage
- pip or conda

### Frontend
- Bootstrap 5: responsive UI framework
- ECharts: data visualization
- jQuery: DOM manipulation
- WebSocket: real-time data push
### Installation

## 📦 Installation and Deployment

### 1. Environment setup
1. **Clone the project**
```bash
# Clone the project
git clone https://gitee.com/zyj118/stock-monitor.git
git clone <your-repo-url>
cd stock-monitor
```

# Create a virtual environment
2. **Create a virtual environment**
```bash
python -m venv venv
source venv/bin/activate  # Linux/Mac
venv\Scripts\activate     # Windows
# Windows
venv\Scripts\activate
# Linux/Mac
source venv/bin/activate
```

# Install dependencies
3. **Install dependencies**
```bash
pip install -r requirements.txt
```

### 2. Obtaining API credentials

#### Tushare API
1. Visit the [Tushare website](https://tushare.pro/register?reg=431380)
2. Register and obtain a token
3. Optional: pay to unlock higher-tier permissions

#### Doubao LLM API
1. Visit [Volcengine Doubao](https://www.volcengine.com/product/doubao)
2. Register an account (individual developers are supported)
3. Create an application in the console
4. Obtain an API key and model ID
5. Pay-as-you-go billing, with a free quota to try

### 3. Environment configuration
Edit the configuration file.
Open app/services/ai_analysis_service.py and replace the model endpoint ID, the Doubao LLM API key, and the base_url with your own.

```
class AIAnalysisService:
    def __init__(self):
        # Configure the OpenAI client to connect to the Volces API
        self.model = ""  # Volces model endpoint ID
        self.client = OpenAI(
            api_key = "",  # Doubao LLM API key
            base_url = "https://ark.cn-beijing.volces.com/api/v3"
        )
```
4. **Configure the environment**
   - Get a Tushare token: register at https://tushare.pro
   - Get Doubao LLM API access: create an application in the Volcengine console
   - Edit the configuration file `app/services/ai_analysis_service.py`:
```python
self.model = "your_model_id"  # Volces model endpoint ID
self.client = OpenAI(
    api_key = "your_api_key",  # Doubao LLM API key
    base_url = "https://ark.cn-beijing.volces.com/api/v3"
)
```

5. **Initialize the database**
```bash
python docs/database/init_database.py
```

### 4. Start the system
6. **Start the service**
```bash
python run.py
```
Visit http://localhost:8000 to use the system.

## 🤝 Contact the Author
7. **Access the application**
   - Home page: http://localhost:8000
   - API docs: http://localhost:8000/docs

If you have any questions or suggestions about the system, feel free to get in touch:
## 📁 Project Structure

- WeChat: zyj118
- QQ: 693696817
- Email: 693696817@qq.com
```
stock-monitor/
├── app/                              # Application root
│   ├── __init__.py                   # FastAPI app initialization
│   ├── api/                          # API routes
│   │   ├── stock_routes.py           # Stock-related API
│   │   └── market_routes.py          # Market data API
│   ├── services/                     # Business logic layer
│   │   ├── stock_service_db.py       # Stock data service
│   │   ├── ai_analysis_service_db.py # AI analysis service
│   │   ├── market_data_service.py    # Market data service
│   │   └── kline_service.py          # K-line data service
│   ├── dao/                          # Data access layer
│   │   ├── base_dao.py               # Base DAO class
│   │   ├── stock_dao.py              # Stock data DAO
│   │   ├── watchlist_dao.py          # Watchlist DAO
│   │   └── ai_analysis_dao.py        # AI analysis DAO
│   ├── models/                       # Data models
│   │   └── stock.py                  # Stock data model
│   ├── templates/                    # HTML templates
│   │   ├── index.html                # Home page template
│   │   ├── market.html               # Market page template
│   │   └── stocks_simple.html        # Stock page template
│   ├── static/                       # Static files
│   ├── database.py                   # Database configuration
│   ├── config.py                     # Application configuration
│   └── scheduler.py                  # Scheduled-task runner
├── docs/                             # Documentation
│   ├── database/                     # Database docs and scripts
│   │   ├── database_schema*.sql      # Database schema files
│   │   ├── init_database.py          # Database initialization script
│   │   └── migrate_to_database.py    # Data migration script
│   └── guides/                       # Guides
│       ├── DATABASE_MIGRATION_GUIDE.md  # Data migration guide
│       └── NEW_FEATURES_GUIDE.md        # New features guide
├── backup/                           # Backup files
│   └── json_backup_20251124_093028/  # JSON data backup
├── run.py                            # Application entry point
├── requirements.txt                  # Python dependencies
├── config.template.py                # Configuration template
├── docker-compose.yml                # Docker deployment config
├── CLAUDE.md                         # Claude Code guide
└── README.md                         # Project documentation
```

## 📝 Usage
## 📱 Feature Modules in Detail

### 1. Add a monitored stock
1. Enter a stock code on the main page
2. Set a target market-value range
3. Click Add
### 1. Stock search and viewing
- Search by stock code
- Shows real-time price, change percentage, and volume
- Displays basic company information
- Multi-dimensional financial indicators (PE, PB, ROE, etc.)

### 2. View AI analysis
1. Click the analysis button on the right of a stock row
2. Choose the analysis dimension (fundamentals / Tao Te Ching / well-known value investors)
3. Wait for the AI analysis result
### 2. Watchlist management
- Add stocks to the watchlist
- Set a market-value alert range
- Target market-value alerts (overvalued, undervalued, fair); see the sketch after this list
- Remove a watched stock with one click
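The alert labels above can be derived by comparing the latest market value against the configured target range. The following is a hedged sketch of that classification, not the project's exact implementation:

```python
def classify_market_value(current: float, target_min: float, target_max: float) -> str:
    """Map a current market value onto the overvalued / undervalued / fair labels."""
    if current > target_max:
        return "overvalued"    # above the configured target range
    if current < target_min:
        return "undervalued"   # below the configured target range
    return "fair"              # inside the target range


# Example: a stock worth 120 billion against a 100-150 billion target range is "fair".
print(classify_market_value(120e9, 100e9, 150e9))
```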

### 3. View index quotes
1. Click "Index Quotes" in the top navigation bar
2. View real-time index data and K-line charts
### 3. AI analysis
- **Value-investing analysis**: in-depth analysis based on financial data
- **Tao Te Ching perspective**: analyzes corporate value from a traditional-culture angle
- **Master-investor perspective**: emulates the methods of famous investors
- Generates professional analysis reports in real time

## ⚠️ Notes
### 4. Market data
- Real-time quotes for the Shanghai and Shenzhen indices
- Sector gain/loss rankings
- Market overview
- K-line chart analysis

1. API usage limits
   - Free Tushare accounts have call-rate limits
   - The Doubao LLM API includes a free quota; beyond it, usage is billed
## 🛠️ API Endpoints

2. Data freshness
   - Quote data is updated in real time
   - AI analysis results are cached for 1 hour by default
The main RESTful API endpoints (a usage sketch follows the list):

3. System performance
   - Keep the number of monitored stocks under 30
   - Use a refresh interval of at least 60 seconds
- `GET /` - home page
- `GET /stocks` - stock page
- `GET /market` - market page
- `GET /api/stock_info/{stock_code}` - get stock information
- `GET /api/watchlist` - get the watchlist
- `POST /api/add_watch` - add a watched stock
- `DELETE /api/remove_watch/{stock_code}` - remove a watched stock
- `GET /api/index_info` - get index information
- `GET /api/company_detail/{stock_code}` - get company details
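A minimal sketch of calling these endpoints from Python with the `requests` library; the request payload fields for `add_watch` are assumptions, since the README does not document the exact body shape.

```python
import requests

BASE_URL = "http://localhost:8000"

# Fetch the current watchlist.
watchlist = requests.get(f"{BASE_URL}/api/watchlist", timeout=10).json()
print(watchlist)

# Add a stock to the watchlist; the payload fields here are assumed, not documented.
resp = requests.post(
    f"{BASE_URL}/api/add_watch",
    json={"stock_code": "600519", "target_market_value_min": 2.0e12, "target_market_value_max": 2.5e12},
    timeout=10,
)
print(resp.json())

# Look up details for one stock.
detail = requests.get(f"{BASE_URL}/api/company_detail/600519", timeout=10).json()
print(detail)
```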

4. Investing carries risk; be cautious when entering the market. This system is only an auxiliary value-investing analysis tool based on big data and large AI models; it does not constitute any trading advice, and you bear the risk yourself.
## ⚙️ Configuration

## 📄 License
### API configuration requirements

MIT License
**Tushare configuration**
- Register an account at https://tushare.pro
- Obtain an API token and put it in the configuration file
- Free accounts have call-rate limits

**Doubao LLM configuration**
- Enable the service in the Volcengine console
- Obtain an API key and a model endpoint ID
- Pay-as-you-go billing, with a free quota to try

### Database configuration
- SQLite is used by default
- Data file location: `data/stocks.db`
- Can be extended to MySQL and other databases

## ⚠️ Important Notes

### Data disclaimer
- Stock data provided by this project is for reference only and does not constitute investment advice
- Stock investment carries risk; be cautious when entering the market
- AI analysis results are computed from public data and models; rely on official data

### API usage limits
- The Tushare API is rate-limited
- The Doubao LLM API includes a free quota; beyond it, usage is billed
- Keep API call frequency reasonable

### Performance recommendations
- Keep the number of monitored stocks under 30
- Use a refresh interval of at least 60 seconds
- Deploy on a dedicated server in production

## 📋 Deployment Guide

### Development
```bash
# Install dependencies
pip install -r requirements.txt

# Start the development server
python run.py
```

### Production deployment
- Containerized deployment with Docker is recommended
- Configure an Nginx reverse proxy
- Use Gunicorn (typically with Uvicorn workers for FastAPI) as the application server
- Configure an SSL certificate for secure access

### Docker deployment example
```bash
# Build the image
docker build -t stock-monitor .

# Run the container
docker run -d -p 8000:8000 stock-monitor
```

## 🤝 Contributing

1. Fork this repository
2. Create a feature_xxx branch
3. Commit your code
4. Open a pull request
Issues and pull requests to improve the project are welcome!

Issues and pull requests are welcome!
This system welcomes secondary development of any kind. If you find it helpful, donations are appreciated; your encouragement keeps the project moving!

### Commit conventions
- Bug fix: `fix: 修复xxx问题`
- New feature: `feat: 添加xxx功能`
- Documentation: `docs: 更新xxx文档`
- Refactoring: `refactor: 重构xxx模块`

## 📞 Contact

For questions or suggestions, reach out via:

- Open an issue: [GitHub Issues](https://github.com/your-username/stock-monitor/issues)
- Email: your-email@example.com

## 📄 License

This project uses the MIT license; see the [LICENSE](LICENSE) file.

---

**⚡ Investing carries risk; be cautious when entering the market! This system is only an auxiliary tool for value-investing analysis and does not constitute any investment advice; make investment decisions carefully and at your own risk!**

**⭐ If this project helps you, please give it a Star!**
@@ -23,4 +23,6 @@ templates = Jinja2Templates(directory=Config.TEMPLATES_DIR)

# 导入路由
from app.api import stock_routes
app.include_router(stock_routes.router)
from app.api import market_routes
app.include_router(stock_routes.router)
app.include_router(market_routes.router)
app/api/market_routes.py (Normal file, 355 lines)
@@ -0,0 +1,355 @@
|
||||
"""
|
||||
市场数据和股票浏览API路由
|
||||
"""
|
||||
from fastapi import APIRouter, Query
|
||||
from typing import Optional, List
|
||||
from app.services.market_data_service import MarketDataService
|
||||
from app.services.kline_service import KlineService
|
||||
from app.scheduler import run_manual_task, get_scheduler_status
|
||||
from datetime import datetime
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
router = APIRouter(prefix="/api/market")
|
||||
market_service = MarketDataService()
|
||||
kline_service = KlineService()
|
||||
|
||||
|
||||
@router.get("/stocks")
|
||||
async def get_all_stocks(
|
||||
page: int = Query(1, description="页码"),
|
||||
size: int = Query(50, description="每页数量"),
|
||||
industry: Optional[str] = Query(None, description="行业代码"),
|
||||
sector: Optional[str] = Query(None, description="概念板块代码"),
|
||||
search: Optional[str] = Query(None, description="搜索关键词")
|
||||
):
|
||||
"""获取所有股票列表,支持分页、行业筛选、概念筛选、搜索"""
|
||||
try:
|
||||
# 基础查询
|
||||
stocks = market_service._get_stock_list_from_db()
|
||||
|
||||
# 筛选
|
||||
if industry:
|
||||
stocks = [s for s in stocks if s.get('industry_code') == industry]
|
||||
|
||||
if sector:
|
||||
# 需要查询股票-板块关联表
|
||||
from app.database import DatabaseManager
|
||||
db_manager = DatabaseManager()
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
cursor.execute("""
|
||||
SELECT stock_code FROM stock_sector_relations WHERE sector_code = %s
|
||||
""", (sector,))
|
||||
sector_stocks = {row[0] for row in cursor.fetchall()}
|
||||
cursor.close()
|
||||
stocks = [s for s in stocks if s['stock_code'] in sector_stocks]
|
||||
|
||||
if search:
|
||||
search_lower = search.lower()
|
||||
stocks = [
|
||||
s for s in stocks
|
||||
if search_lower in s['stock_name'].lower() or search_lower in s['stock_code']
|
||||
]
|
||||
|
||||
# 分页
|
||||
total = len(stocks)
|
||||
start = (page - 1) * size
|
||||
end = start + size
|
||||
page_stocks = stocks[start:end]
|
||||
|
||||
return {
|
||||
"total": total,
|
||||
"page": page,
|
||||
"size": size,
|
||||
"pages": (total + size - 1) // size,
|
||||
"data": page_stocks
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取股票列表失败: {e}")
|
||||
return {"error": f"获取股票列表失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/industries")
|
||||
async def get_industries():
|
||||
"""获取所有行业分类"""
|
||||
try:
|
||||
industries = market_service.get_industry_list()
|
||||
return {"data": industries}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取行业列表失败: {e}")
|
||||
return {"error": f"获取行业列表失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/sectors")
|
||||
async def get_sectors():
|
||||
"""获取所有概念板块"""
|
||||
try:
|
||||
sectors = market_service.get_sector_list()
|
||||
return {"data": sectors}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取概念板块失败: {e}")
|
||||
return {"error": f"获取概念板块失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/stocks/{stock_code}")
|
||||
async def get_stock_detail(stock_code: str):
|
||||
"""获取股票详细信息"""
|
||||
try:
|
||||
# 获取股票基础信息
|
||||
from app.database import DatabaseManager
|
||||
db_manager = DatabaseManager()
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
query = """
|
||||
SELECT s.*, i.industry_name,
|
||||
GROUP_CONCAT(DISTINCT sec.sector_name) as sector_names
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
LEFT JOIN stock_sector_relations ssr ON s.stock_code = ssr.stock_code
|
||||
LEFT JOIN sectors sec ON ssr.sector_code = sec.sector_code
|
||||
WHERE s.stock_code = %s
|
||||
GROUP BY s.stock_code
|
||||
"""
|
||||
cursor.execute(query, (stock_code,))
|
||||
stock = cursor.fetchone()
|
||||
cursor.close()
|
||||
|
||||
if not stock:
|
||||
return {"error": "股票不存在"}
|
||||
|
||||
return {"data": stock}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取股票详情失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"获取股票详情失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/stocks/{stock_code}/kline")
|
||||
async def get_kline_data(
|
||||
stock_code: str,
|
||||
kline_type: str = Query("daily", description="K线类型: daily/weekly/monthly"),
|
||||
days: int = Query(30, description="获取天数"),
|
||||
start_date: Optional[str] = Query(None, description="开始日期 YYYYMMDD"),
|
||||
end_date: Optional[str] = Query(None, description="结束日期 YYYYMMDD")
|
||||
):
|
||||
"""获取股票K线数据"""
|
||||
try:
|
||||
# 确定时间范围
|
||||
limit = days
|
||||
if start_date and end_date:
|
||||
# 如果指定了日期范围,不限制数量
|
||||
limit = 1000
|
||||
|
||||
kline_data = kline_service.get_kline_data(
|
||||
stock_code=stock_code,
|
||||
kline_type=kline_type,
|
||||
start_date=start_date,
|
||||
end_date=end_date,
|
||||
limit=limit
|
||||
)
|
||||
|
||||
# 获取股票基本信息
|
||||
from app.services.stock_service_db import StockServiceDB
|
||||
stock_service = StockServiceDB()
|
||||
stock_info = stock_service.get_stock_info(stock_code)
|
||||
|
||||
return {
|
||||
"stock_info": stock_info,
|
||||
"kline_type": kline_type,
|
||||
"data": kline_data
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取K线数据失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"获取K线数据失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/overview")
|
||||
async def get_market_overview():
|
||||
"""获取市场概览数据"""
|
||||
try:
|
||||
overview = kline_service.get_market_overview()
|
||||
return {"data": overview}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取市场概览失败: {e}")
|
||||
return {"error": f"获取市场概览失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/hot-stocks")
|
||||
async def get_hot_stocks(
|
||||
rank_type: str = Query("volume", description="排行榜类型: volume/amount/change"),
|
||||
limit: int = Query(20, description="返回数量")
|
||||
):
|
||||
"""获取热门股票排行榜"""
|
||||
try:
|
||||
from app.database import DatabaseManager
|
||||
db_manager = DatabaseManager()
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
today = datetime.now().strftime('%Y-%m-%d')
|
||||
|
||||
if rank_type == "volume":
|
||||
query = """
|
||||
SELECT s.stock_code, s.stock_name, k.close_price, k.volume,
|
||||
k.change_percent, k.amount, i.industry_name
|
||||
FROM kline_data k
|
||||
JOIN stocks s ON k.stock_code = s.stock_code
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE k.kline_type = 'daily' AND k.trade_date = %s
|
||||
ORDER BY k.volume DESC
|
||||
LIMIT %s
|
||||
"""
|
||||
elif rank_type == "amount":
|
||||
query = """
|
||||
SELECT s.stock_code, s.stock_name, k.close_price, k.volume,
|
||||
k.change_percent, k.amount, i.industry_name
|
||||
FROM kline_data k
|
||||
JOIN stocks s ON k.stock_code = s.stock_code
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE k.kline_type = 'daily' AND k.trade_date = %s
|
||||
ORDER BY k.amount DESC
|
||||
LIMIT %s
|
||||
"""
|
||||
elif rank_type == "change":
|
||||
query = """
|
||||
SELECT s.stock_code, s.stock_name, k.close_price, k.volume,
|
||||
k.change_percent, k.amount, i.industry_name
|
||||
FROM kline_data k
|
||||
JOIN stocks s ON k.stock_code = s.stock_code
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE k.kline_type = 'daily' AND k.trade_date = %s AND k.change_percent IS NOT NULL
|
||||
ORDER BY k.change_percent DESC
|
||||
LIMIT %s
|
||||
"""
|
||||
else:
|
||||
return {"error": "不支持的排行榜类型"}
|
||||
|
||||
cursor.execute(query, (today, limit))
|
||||
stocks = cursor.fetchall()
|
||||
cursor.close()
|
||||
|
||||
return {"data": stocks, "rank_type": rank_type}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取热门股票失败: {e}")
|
||||
return {"error": f"获取热门股票失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.post("/tasks/{task_name}")
|
||||
async def run_manual_task(task_name: str):
|
||||
"""手动执行定时任务"""
|
||||
try:
|
||||
result = run_manual_task(task_name)
|
||||
return {"data": result}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"手动执行任务失败: {task_name}, 错误: {e}")
|
||||
return {"error": f"手动执行任务失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/tasks/status")
|
||||
async def get_task_status(
|
||||
task_type: Optional[str] = Query(None, description="任务类型"),
|
||||
days: int = Query(7, description="查询天数")
|
||||
):
|
||||
"""获取任务执行状态"""
|
||||
try:
|
||||
tasks = get_scheduler_status(task_type, days)
|
||||
return {"data": tasks}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取任务状态失败: {e}")
|
||||
return {"error": f"获取任务状态失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.post("/sync")
|
||||
async def sync_market_data():
|
||||
"""同步市场数据"""
|
||||
try:
|
||||
# 更新股票列表
|
||||
stocks = market_service.get_all_stock_list(force_refresh=True)
|
||||
stock_count = len(stocks)
|
||||
|
||||
# 更新概念分类
|
||||
concept_count = market_service.update_stock_sectors()
|
||||
|
||||
# 更新当日K线数据
|
||||
kline_result = kline_service.batch_update_kline_data(days_back=1)
|
||||
|
||||
return {
|
||||
"message": "市场数据同步完成",
|
||||
"stocks_updated": stock_count,
|
||||
"concepts_updated": concept_count,
|
||||
"kline_updated": kline_result
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"同步市场数据失败: {e}")
|
||||
return {"error": f"同步市场数据失败: {str(e)}"}
|
||||
|
||||
|
||||
@router.get("/statistics")
|
||||
async def get_market_statistics(
|
||||
days: int = Query(30, description="统计天数")
|
||||
):
|
||||
"""获取市场统计数据"""
|
||||
try:
|
||||
from app.database import DatabaseManager
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
db_manager = DatabaseManager()
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
start_date = (datetime.now() - timedelta(days=days)).date()
|
||||
|
||||
# 获取市场统计数据
|
||||
query = """
|
||||
SELECT stat_date, market_code, total_stocks, up_stocks, down_stocks,
|
||||
flat_stocks, total_amount, total_volume
|
||||
FROM market_statistics
|
||||
WHERE stat_date >= %s
|
||||
ORDER BY stat_date DESC, market_code
|
||||
"""
|
||||
cursor.execute(query, (start_date,))
|
||||
stats = cursor.fetchall()
|
||||
|
||||
# 获取行业分布统计
|
||||
cursor.execute("""
|
||||
SELECT i.industry_name, COUNT(s.stock_code) as stock_count
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE s.is_active = TRUE AND i.industry_name IS NOT NULL
|
||||
GROUP BY i.industry_name
|
||||
ORDER BY stock_count DESC
|
||||
""")
|
||||
industry_stats = cursor.fetchall()
|
||||
|
||||
# 获取市场规模统计
|
||||
cursor.execute("""
|
||||
SELECT market_type, COUNT(*) as stock_count
|
||||
FROM stocks
|
||||
WHERE is_active = TRUE
|
||||
GROUP BY market_type
|
||||
""")
|
||||
market_type_stats = cursor.fetchall()
|
||||
|
||||
cursor.close()
|
||||
|
||||
return {
|
||||
"statistics": stats,
|
||||
"industry_distribution": industry_stats,
|
||||
"market_type_distribution": market_type_stats,
|
||||
"period_days": days
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"获取市场统计数据失败: {e}")
|
||||
return {"error": f"获取市场统计数据失败: {str(e)}"}
|
||||
@@ -40,6 +40,10 @@ async def get_index_info():
async def market(request: Request):
    return templates.TemplateResponse("market.html", {"request": request})

@router.get("/stocks")
async def stocks(request: Request):
    return templates.TemplateResponse("stocks_simple.html", {"request": request})

@router.get("/api/company_detail/{stock_code}")
async def get_company_detail(stock_code: str):
    return stock_service.get_company_detail(stock_code)
app/dao/__init__.py (Normal file, 17 lines)
@@ -0,0 +1,17 @@
"""
数据访问对象模块
"""

from .base_dao import BaseDAO
from .stock_dao import StockDAO
from .watchlist_dao import WatchlistDAO
from .ai_analysis_dao import AIAnalysisDAO
from .config_dao import ConfigDAO

__all__ = [
    'BaseDAO',
    'StockDAO',
    'WatchlistDAO',
    'AIAnalysisDAO',
    'ConfigDAO'
]
app/dao/ai_analysis_dao.py (Normal file, 219 lines)
@@ -0,0 +1,219 @@
|
||||
"""
|
||||
AI分析数据访问对象
|
||||
"""
|
||||
from typing import Dict, List, Optional, Any
|
||||
import json
|
||||
from datetime import datetime, date
|
||||
|
||||
from .base_dao import BaseDAO
|
||||
|
||||
|
||||
class AIAnalysisDAO(BaseDAO):
|
||||
"""AI分析数据访问对象"""
|
||||
|
||||
def save_analysis(self, stock_code: str, analysis_type: str, analysis_data: Dict,
|
||||
analysis_date: str = None) -> bool:
|
||||
"""保存AI分析结果"""
|
||||
if analysis_date is None:
|
||||
analysis_date = self.get_today_date()
|
||||
|
||||
try:
|
||||
# 检查是否已存在当日的分析
|
||||
existing = self.get_analysis(stock_code, analysis_type, analysis_date)
|
||||
|
||||
investment_summary = analysis_data.get('investment_suggestion', {})
|
||||
investment_key_points = json.dumps(investment_summary.get('key_points', []), ensure_ascii=False)
|
||||
investment_summary_text = investment_summary.get('summary', '')
|
||||
investment_action = investment_summary.get('action', '')
|
||||
|
||||
price_analysis = analysis_data.get('price_analysis', {})
|
||||
|
||||
if existing:
|
||||
# 更新现有分析
|
||||
query = """
|
||||
UPDATE ai_analysis SET
|
||||
investment_summary = %s,
|
||||
investment_action = %s,
|
||||
investment_key_points = %s,
|
||||
valuation_analysis = %s,
|
||||
financial_analysis = %s,
|
||||
growth_analysis = %s,
|
||||
risk_analysis = %s,
|
||||
reasonable_price_min = %s,
|
||||
reasonable_price_max = %s,
|
||||
target_market_value_min = %s,
|
||||
target_market_value_max = %s,
|
||||
from_cache = %s,
|
||||
updated_at = CURRENT_TIMESTAMP
|
||||
WHERE stock_code = %s AND analysis_type = %s AND analysis_date = %s
|
||||
"""
|
||||
self._execute_update(query, (
|
||||
investment_summary_text,
|
||||
investment_action,
|
||||
investment_key_points,
|
||||
analysis_data.get('analysis', {}).get('估值分析', ''),
|
||||
analysis_data.get('analysis', {}).get('财务健康状况', ''),
|
||||
analysis_data.get('analysis', {}).get('成长潜力', ''),
|
||||
analysis_data.get('analysis', {}).get('风险评估', ''),
|
||||
self.parse_float(price_analysis.get('合理价格区间', [None, None])[0]),
|
||||
self.parse_float(price_analysis.get('合理价格区间', [None, None])[1]),
|
||||
self.parse_float(price_analysis.get('目标市值区间', [None, None])[0]),
|
||||
self.parse_float(price_analysis.get('目标市值区间', [None, None])[1]),
|
||||
bool(analysis_data.get('from_cache', False)),
|
||||
stock_code,
|
||||
analysis_type,
|
||||
analysis_date
|
||||
))
|
||||
else:
|
||||
# 插入新分析
|
||||
query = """
|
||||
INSERT INTO ai_analysis (
|
||||
stock_code, analysis_type, analysis_date,
|
||||
investment_summary, investment_action, investment_key_points,
|
||||
valuation_analysis, financial_analysis, growth_analysis, risk_analysis,
|
||||
reasonable_price_min, reasonable_price_max,
|
||||
target_market_value_min, target_market_value_max, from_cache
|
||||
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||
"""
|
||||
self._execute_insert(query, (
|
||||
stock_code, analysis_type, analysis_date,
|
||||
investment_summary_text, investment_action, investment_key_points,
|
||||
analysis_data.get('analysis', {}).get('估值分析', ''),
|
||||
analysis_data.get('analysis', {}).get('财务健康状况', ''),
|
||||
analysis_data.get('analysis', {}).get('成长潜力', ''),
|
||||
analysis_data.get('analysis', {}).get('风险评估', ''),
|
||||
self.parse_float(price_analysis.get('合理价格区间', [None, None])[0]),
|
||||
self.parse_float(price_analysis.get('合理价格区间', [None, None])[1]),
|
||||
self.parse_float(price_analysis.get('目标市值区间', [None, None])[0]),
|
||||
self.parse_float(price_analysis.get('目标市值区间', [None, None])[1]),
|
||||
bool(analysis_data.get('from_cache', False))
|
||||
))
|
||||
|
||||
self.log_data_update(f'ai_analysis_{analysis_type}', stock_code, 'success', 'Analysis saved')
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"保存AI分析失败: {stock_code}, {analysis_type}, 错误: {e}")
|
||||
self.log_data_update(f'ai_analysis_{analysis_type}', stock_code, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_analysis(self, stock_code: str, analysis_type: str, analysis_date: str = None) -> Optional[Dict]:
|
||||
"""获取AI分析结果"""
|
||||
if analysis_date is None:
|
||||
analysis_date = self.get_today_date()
|
||||
|
||||
query = """
|
||||
SELECT * FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_type = %s AND analysis_date = %s
|
||||
"""
|
||||
return self._execute_single_query(query, (stock_code, analysis_type, analysis_date))
|
||||
|
||||
def get_latest_analysis(self, stock_code: str, analysis_type: str) -> Optional[Dict]:
|
||||
"""获取最新的AI分析结果"""
|
||||
query = """
|
||||
SELECT * FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_type = %s
|
||||
ORDER BY analysis_date DESC
|
||||
LIMIT 1
|
||||
"""
|
||||
return self._execute_single_query(query, (stock_code, analysis_type))
|
||||
|
||||
def get_all_analysis_types(self, stock_code: str, analysis_date: str = None) -> List[Dict]:
|
||||
"""获取股票的所有类型分析"""
|
||||
if analysis_date is None:
|
||||
analysis_date = self.get_today_date()
|
||||
|
||||
query = """
|
||||
SELECT * FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_date = %s
|
||||
ORDER BY analysis_type
|
||||
"""
|
||||
return self._execute_query(query, (stock_code, analysis_date))
|
||||
|
||||
def format_analysis_data(self, analysis_record: Dict) -> Dict:
|
||||
"""将数据库记录格式化为原始分析数据格式"""
|
||||
if not analysis_record:
|
||||
return {}
|
||||
|
||||
# 解析JSON字段
|
||||
key_points = []
|
||||
if analysis_record.get('investment_key_points'):
|
||||
try:
|
||||
key_points = json.loads(analysis_record['investment_key_points'])
|
||||
except json.JSONDecodeError:
|
||||
key_points = []
|
||||
|
||||
# 构建投资建议
|
||||
investment_suggestion = {
|
||||
'summary': analysis_record.get('investment_summary', ''),
|
||||
'action': analysis_record.get('investment_action', ''),
|
||||
'key_points': key_points
|
||||
}
|
||||
|
||||
# 构建分析详情
|
||||
analysis = {}
|
||||
if analysis_record.get('valuation_analysis'):
|
||||
analysis['估值分析'] = analysis_record['valuation_analysis']
|
||||
if analysis_record.get('financial_analysis'):
|
||||
analysis['财务健康状况'] = analysis_record['financial_analysis']
|
||||
if analysis_record.get('growth_analysis'):
|
||||
analysis['成长潜力'] = analysis_record['growth_analysis']
|
||||
if analysis_record.get('risk_analysis'):
|
||||
analysis['风险评估'] = analysis_record['risk_analysis']
|
||||
|
||||
# 构建价格分析
|
||||
price_analysis = {}
|
||||
if analysis_record.get('reasonable_price_min') or analysis_record.get('reasonable_price_max'):
|
||||
price_analysis['合理价格区间'] = [
|
||||
analysis_record.get('reasonable_price_min'),
|
||||
analysis_record.get('reasonable_price_max')
|
||||
]
|
||||
if analysis_record.get('target_market_value_min') or analysis_record.get('target_market_value_max'):
|
||||
price_analysis['目标市值区间'] = [
|
||||
analysis_record.get('target_market_value_min'),
|
||||
analysis_record.get('target_market_value_max')
|
||||
]
|
||||
|
||||
return {
|
||||
'investment_suggestion': investment_suggestion,
|
||||
'analysis': analysis,
|
||||
'price_analysis': price_analysis,
|
||||
'from_cache': analysis_record.get('from_cache', False)
|
||||
}
|
||||
|
||||
def get_analysis_history(self, stock_code: str, analysis_type: str,
|
||||
days: int = 30) -> List[Dict]:
|
||||
"""获取分析历史"""
|
||||
query = """
|
||||
SELECT * FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_type = %s
|
||||
AND analysis_date >= DATE_SUB(CURDATE(), INTERVAL %s DAY)
|
||||
ORDER BY analysis_date DESC
|
||||
"""
|
||||
return self._execute_query(query, (stock_code, analysis_type, days))
|
||||
|
||||
def delete_analysis(self, stock_code: str, analysis_type: str,
|
||||
before_date: str = None) -> int:
|
||||
"""删除分析数据"""
|
||||
if before_date:
|
||||
query = """
|
||||
DELETE FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_type = %s AND analysis_date < %s
|
||||
"""
|
||||
return self._execute_update(query, (stock_code, analysis_type, before_date))
|
||||
else:
|
||||
query = """
|
||||
DELETE FROM ai_analysis
|
||||
WHERE stock_code = %s AND analysis_type = %s
|
||||
"""
|
||||
return self._execute_update(query, (stock_code, analysis_type))
|
||||
|
||||
def get_analysis_count(self, analysis_type: str = None) -> int:
|
||||
"""获取分析数量"""
|
||||
if analysis_type:
|
||||
query = "SELECT COUNT(*) as count FROM ai_analysis WHERE analysis_type = %s"
|
||||
result = self._execute_single_query(query, (analysis_type,))
|
||||
else:
|
||||
query = "SELECT COUNT(*) as count FROM ai_analysis"
|
||||
result = self._execute_single_query(query)
|
||||
return result['count'] if result else 0
|
||||
app/dao/base_dao.py (Normal file, 114 lines)
@@ -0,0 +1,114 @@
"""
基础数据访问对象
"""
from abc import ABC, abstractmethod
from typing import Dict, Any, Optional, List
import logging
from datetime import datetime, date

from app.database import DatabaseManager


class BaseDAO(ABC):
    """数据访问对象基类"""

    def __init__(self):
        self.db_manager = DatabaseManager()
        self.logger = logging.getLogger(self.__class__.__name__)

    def _execute_query(self, query: str, params: Optional[tuple] = None) -> List[Dict]:
        """执行查询语句"""
        try:
            with self.db_manager.get_cursor() as cursor:
                cursor.execute(query, params)
                return cursor.fetchall()
        except Exception as e:
            self.logger.error(f"查询执行失败: {query}, 参数: {params}, 错误: {e}")
            raise

    def _execute_single_query(self, query: str, params: Optional[tuple] = None) -> Optional[Dict]:
        """执行单条记录查询"""
        try:
            with self.db_manager.get_cursor() as cursor:
                cursor.execute(query, params)
                return cursor.fetchone()
        except Exception as e:
            self.logger.error(f"单条查询执行失败: {query}, 参数: {params}, 错误: {e}")
            raise

    def _execute_update(self, query: str, params: Optional[tuple] = None) -> int:
        """执行更新语句,返回影响的行数"""
        try:
            with self.db_manager.get_connection() as conn:
                with conn.cursor() as cursor:
                    cursor.execute(query, params)
                    affected_rows = cursor.rowcount
                    conn.commit()
                    return affected_rows
        except Exception as e:
            self.logger.error(f"更新执行失败: {query}, 参数: {params}, 错误: {e}")
            raise

    def _execute_insert(self, query: str, params: Optional[tuple] = None) -> int:
        """执行插入语句,返回插入的ID"""
        try:
            with self.db_manager.get_connection() as conn:
                with conn.cursor() as cursor:
                    cursor.execute(query, params)
                    inserted_id = cursor.lastrowid
                    conn.commit()
                    return inserted_id
        except Exception as e:
            self.logger.error(f"插入执行失败: {query}, 参数: {params}, 错误: {e}")
            raise

    def _execute_batch_insert(self, query: str, params_list: List[tuple]) -> int:
        """批量插入数据,返回插入的总行数"""
        if not params_list:
            return 0

        try:
            with self.db_manager.get_connection() as conn:
                with conn.cursor() as cursor:
                    cursor.executemany(query, params_list)
                    affected_rows = cursor.rowcount
                    conn.commit()
                    return affected_rows
        except Exception as e:
            self.logger.error(f"批量插入失败: {query}, 参数数量: {len(params_list)}, 错误: {e}")
            raise

    def log_data_update(self, data_type: str, stock_code: str, status: str,
                        message: str = None, execution_time: float = None):
        """记录数据更新日志"""
        try:
            query = """
                INSERT INTO data_update_log
                (data_type, stock_code, update_status, update_message, execution_time)
                VALUES (%s, %s, %s, %s, %s)
            """
            self._execute_insert(query, (data_type, stock_code, status, message, execution_time))
        except Exception as e:
            self.logger.error(f"记录更新日志失败: {e}")

    def get_today_date(self) -> str:
        """获取今天的日期字符串"""
        return date.today().strftime('%Y-%m-%d')

    def parse_float(self, value: Any) -> Optional[float]:
        """解析浮点数"""
        if value is None or value == '':
            return None
        try:
            return float(value)
        except (ValueError, TypeError):
            return None

    def parse_int(self, value: Any) -> Optional[int]:
        """解析整数"""
        if value is None or value == '':
            return None
        try:
            return int(value)
        except (ValueError, TypeError):
            return None
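As a hedged illustration of how the concrete DAOs in this commit build on the base class above (the table and column names in this sketch are assumptions, not the project's actual schema):

```python
from typing import Dict, List

from app.dao.base_dao import BaseDAO


class SectorDAO(BaseDAO):
    """Hypothetical DAO showing the intended usage pattern of the BaseDAO helpers."""

    def get_sectors(self) -> List[Dict]:
        # _execute_query returns a list of dict rows.
        return self._execute_query(
            "SELECT sector_code, sector_name FROM sectors ORDER BY sector_code"
        )

    def add_sector(self, sector_code: str, sector_name: str) -> int:
        # _execute_insert commits and returns the new row id.
        new_id = self._execute_insert(
            "INSERT INTO sectors (sector_code, sector_name) VALUES (%s, %s)",
            (sector_code, sector_name),
        )
        self.log_data_update('sector', sector_code, 'success', 'Sector added')
        return new_id
```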
app/dao/config_dao.py (Normal file, 171 lines)
@@ -0,0 +1,171 @@
|
||||
"""
|
||||
系统配置数据访问对象
|
||||
"""
|
||||
from typing import Dict, List, Optional, Any
|
||||
import json
|
||||
from datetime import datetime, date
|
||||
|
||||
from .base_dao import BaseDAO
|
||||
|
||||
|
||||
class ConfigDAO(BaseDAO):
|
||||
"""系统配置数据访问对象"""
|
||||
|
||||
def get_config(self, key: str, default_value: Any = None) -> Any:
|
||||
"""获取配置值"""
|
||||
query = "SELECT config_value, config_type FROM system_config WHERE config_key = %s"
|
||||
result = self._execute_single_query(query, (key,))
|
||||
|
||||
if not result:
|
||||
return default_value
|
||||
|
||||
config_value = result['config_value']
|
||||
config_type = result['config_type']
|
||||
|
||||
# 根据类型转换值
|
||||
if config_type == 'integer':
|
||||
try:
|
||||
return int(config_value) if config_value else default_value
|
||||
except (ValueError, TypeError):
|
||||
return default_value
|
||||
elif config_type == 'float':
|
||||
try:
|
||||
return float(config_value) if config_value else default_value
|
||||
except (ValueError, TypeError):
|
||||
return default_value
|
||||
elif config_type == 'boolean':
|
||||
return config_value.lower() in ('true', '1', 'yes', 'on') if config_value else default_value
|
||||
elif config_type == 'json':
|
||||
try:
|
||||
return json.loads(config_value) if config_value else default_value
|
||||
except json.JSONDecodeError:
|
||||
return default_value
|
||||
else: # string
|
||||
return config_value if config_value else default_value
|
||||
|
||||
def set_config(self, key: str, value: Any, config_type: str = 'string') -> bool:
|
||||
"""设置配置值"""
|
||||
try:
|
||||
# 转换值为字符串存储
|
||||
if config_type == 'json':
|
||||
str_value = json.dumps(value, ensure_ascii=False)
|
||||
elif config_type == 'boolean':
|
||||
str_value = str(value).lower()
|
||||
else:
|
||||
str_value = str(value)
|
||||
|
||||
# 检查配置是否存在
|
||||
existing = self._execute_single_query(
|
||||
"SELECT id FROM system_config WHERE config_key = %s", (key,)
|
||||
)
|
||||
|
||||
if existing:
|
||||
# 更新现有配置
|
||||
query = """
|
||||
UPDATE system_config
|
||||
SET config_value = %s, config_type = %s, updated_at = CURRENT_TIMESTAMP
|
||||
WHERE config_key = %s
|
||||
"""
|
||||
self._execute_update(query, (str_value, config_type, key))
|
||||
else:
|
||||
# 插入新配置
|
||||
query = """
|
||||
INSERT INTO system_config (config_key, config_value, config_type)
|
||||
VALUES (%s, %s, %s)
|
||||
"""
|
||||
self._execute_insert(query, (key, str_value, config_type))
|
||||
|
||||
self.log_data_update('config', key, 'success', f'Config updated: {key}={value}')
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"设置配置失败: {key}={value}, 错误: {e}")
|
||||
self.log_data_update('config', key, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_all_configs(self) -> Dict[str, Dict]:
|
||||
"""获取所有配置"""
|
||||
query = "SELECT * FROM system_config ORDER BY config_key"
|
||||
results = self._execute_query(query)
|
||||
|
||||
configs = {}
|
||||
for result in results:
|
||||
key = result['config_key']
|
||||
configs[key] = {
|
||||
'value': self.get_config(key),
|
||||
'type': result['config_type'],
|
||||
'created_at': result['created_at'],
|
||||
'updated_at': result['updated_at']
|
||||
}
|
||||
|
||||
return configs
|
||||
|
||||
def delete_config(self, key: str) -> bool:
|
||||
"""删除配置"""
|
||||
try:
|
||||
query = "DELETE FROM system_config WHERE config_key = %s"
|
||||
affected_rows = self._execute_update(query, (key,))
|
||||
success = affected_rows > 0
|
||||
|
||||
if success:
|
||||
self.log_data_update('config', key, 'success', 'Config deleted')
|
||||
else:
|
||||
self.log_data_update('config', key, 'failed', 'Config not found')
|
||||
|
||||
return success
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"删除配置失败: {key}, 错误: {e}")
|
||||
self.log_data_update('config', key, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def increment_counter(self, key: str, increment: int = 1) -> int:
|
||||
"""递增计数器配置"""
|
||||
try:
|
||||
current_value = self.get_config(key, 0)
|
||||
new_value = current_value + increment
|
||||
self.set_config(key, new_value, 'integer')
|
||||
return new_value
|
||||
except Exception as e:
|
||||
self.logger.error(f"递增计数器失败: {key}, 错误: {e}")
|
||||
return 0
|
||||
|
||||
def reset_daily_counters(self) -> None:
|
||||
"""重置每日计数器"""
|
||||
daily_counters = [
|
||||
'tushare_api_calls_today',
|
||||
]
|
||||
|
||||
for counter in daily_counters:
|
||||
self.set_config(counter, 0, 'integer')
|
||||
|
||||
# 更新最后重置日期
|
||||
self.set_config('last_counter_reset_date', self.get_today_date(), 'date')
|
||||
|
||||
def get_last_data_update_date(self) -> Optional[str]:
|
||||
"""获取最后数据更新日期"""
|
||||
return self.get_config('last_data_update_date')
|
||||
|
||||
def set_last_data_update_date(self, date_str: str) -> bool:
|
||||
"""设置最后数据更新日期"""
|
||||
return self.set_config('last_data_update_date', date_str, 'date')
|
||||
|
||||
def get_cache_expiration_hours(self) -> int:
|
||||
"""获取缓存过期时间(小时)"""
|
||||
return self.get_config('cache_expiration_hours', 24)
|
||||
|
||||
def get_max_watchlist_size(self) -> int:
|
||||
"""获取最大监控列表大小"""
|
||||
return self.get_config('max_watchlist_size', 50)
|
||||
|
||||
def is_cache_expired(self, data_date: str) -> bool:
|
||||
"""检查缓存是否过期"""
|
||||
try:
|
||||
cache_hours = self.get_cache_expiration_hours()
|
||||
current_date = date.today()
|
||||
data_date_obj = datetime.strptime(data_date, '%Y-%m-%d').date()
|
||||
|
||||
days_diff = (current_date - data_date_obj).days
|
||||
return days_diff > 0 # 如果不是今天的数据,就算过期
|
||||
except Exception:
|
||||
return True # 如果无法解析日期,认为过期
|
||||
app/dao/stock_dao.py (Normal file, 208 lines)
@@ -0,0 +1,208 @@
|
||||
"""
|
||||
股票数据访问对象
|
||||
"""
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
import json
|
||||
from datetime import datetime, date
|
||||
|
||||
from .base_dao import BaseDAO
|
||||
|
||||
|
||||
class StockDAO(BaseDAO):
|
||||
"""股票数据访问对象"""
|
||||
|
||||
def get_stock_by_code(self, stock_code: str) -> Optional[Dict]:
|
||||
"""根据股票代码获取股票信息"""
|
||||
query = "SELECT * FROM stocks WHERE stock_code = %s"
|
||||
return self._execute_single_query(query, (stock_code,))
|
||||
|
||||
def add_or_update_stock(self, stock_code: str, stock_name: str, market: str) -> int:
|
||||
"""添加或更新股票信息"""
|
||||
existing = self.get_stock_by_code(stock_code)
|
||||
|
||||
if existing:
|
||||
# 更新现有股票
|
||||
query = """
|
||||
UPDATE stocks
|
||||
SET stock_name = %s, market = %s, updated_at = CURRENT_TIMESTAMP
|
||||
WHERE stock_code = %s
|
||||
"""
|
||||
self._execute_update(query, (stock_name, market, stock_code))
|
||||
return existing['id']
|
||||
else:
|
||||
# 添加新股票
|
||||
query = """
|
||||
INSERT INTO stocks (stock_code, stock_name, market)
|
||||
VALUES (%s, %s, %s)
|
||||
"""
|
||||
return self._execute_insert(query, (stock_code, stock_name, market))
|
||||
|
||||
def get_stock_data(self, stock_code: str, data_date: str = None) -> Optional[Dict]:
|
||||
"""获取股票数据"""
|
||||
if data_date is None:
|
||||
data_date = self.get_today_date()
|
||||
|
||||
query = """
|
||||
SELECT sd.*, s.stock_name
|
||||
FROM stock_data sd
|
||||
JOIN stocks s ON sd.stock_code = s.stock_code
|
||||
WHERE sd.stock_code = %s AND sd.data_date = %s
|
||||
"""
|
||||
return self._execute_single_query(query, (stock_code, data_date))
|
||||
|
||||
def save_stock_data(self, stock_code: str, stock_info: Dict, data_date: str = None) -> bool:
|
||||
"""保存股票数据"""
|
||||
if data_date is None:
|
||||
data_date = self.get_today_date()
|
||||
|
||||
try:
|
||||
# 确保股票信息存在
|
||||
self.add_or_update_stock(
|
||||
stock_code,
|
||||
stock_info.get('name', ''),
|
||||
'SH' if stock_code.startswith('6') else 'SZ'
|
||||
)
|
||||
|
||||
# 检查是否已存在当日数据
|
||||
existing = self.get_stock_data(stock_code, data_date)
|
||||
|
||||
if existing:
|
||||
# 更新现有数据
|
||||
query = """
|
||||
UPDATE stock_data SET
|
||||
price = %s,
|
||||
change_percent = %s,
|
||||
market_value = %s,
|
||||
pe_ratio = %s,
|
||||
pb_ratio = %s,
|
||||
ps_ratio = %s,
|
||||
dividend_yield = %s,
|
||||
roe = %s,
|
||||
gross_profit_margin = %s,
|
||||
net_profit_margin = %s,
|
||||
debt_to_assets = %s,
|
||||
revenue_yoy = %s,
|
||||
net_profit_yoy = %s,
|
||||
bps = %s,
|
||||
ocfps = %s,
|
||||
from_cache = %s,
|
||||
updated_at = CURRENT_TIMESTAMP
|
||||
WHERE stock_code = %s AND data_date = %s
|
||||
"""
|
||||
self._execute_update(query, (
|
||||
self.parse_float(stock_info.get('price')),
|
||||
self.parse_float(stock_info.get('change_percent')),
|
||||
self.parse_float(stock_info.get('market_value')),
|
||||
self.parse_float(stock_info.get('pe_ratio')),
|
||||
self.parse_float(stock_info.get('pb_ratio')),
|
||||
self.parse_float(stock_info.get('ps_ratio')),
|
||||
self.parse_float(stock_info.get('dividend_yield')),
|
||||
self.parse_float(stock_info.get('roe')),
|
||||
self.parse_float(stock_info.get('gross_profit_margin')),
|
||||
self.parse_float(stock_info.get('net_profit_margin')),
|
||||
self.parse_float(stock_info.get('debt_to_assets')),
|
||||
self.parse_float(stock_info.get('revenue_yoy')),
|
||||
self.parse_float(stock_info.get('net_profit_yoy')),
|
||||
self.parse_float(stock_info.get('bps')),
|
||||
self.parse_float(stock_info.get('ocfps')),
|
||||
bool(stock_info.get('from_cache', False)),
|
||||
stock_code,
|
||||
data_date
|
||||
))
|
||||
else:
|
||||
# 插入新数据
|
||||
query = """
|
||||
INSERT INTO stock_data (
|
||||
stock_code, data_date, price, change_percent, market_value,
|
||||
pe_ratio, pb_ratio, ps_ratio, dividend_yield,
|
||||
roe, gross_profit_margin, net_profit_margin, debt_to_assets,
|
||||
revenue_yoy, net_profit_yoy, bps, ocfps, from_cache
|
||||
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||
"""
|
||||
self._execute_insert(query, (
|
||||
stock_code, data_date,
|
||||
self.parse_float(stock_info.get('price')),
|
||||
self.parse_float(stock_info.get('change_percent')),
|
||||
self.parse_float(stock_info.get('market_value')),
|
||||
self.parse_float(stock_info.get('pe_ratio')),
|
||||
self.parse_float(stock_info.get('pb_ratio')),
|
||||
self.parse_float(stock_info.get('ps_ratio')),
|
||||
self.parse_float(stock_info.get('dividend_yield')),
|
||||
self.parse_float(stock_info.get('roe')),
|
||||
self.parse_float(stock_info.get('gross_profit_margin')),
|
||||
self.parse_float(stock_info.get('net_profit_margin')),
|
||||
self.parse_float(stock_info.get('debt_to_assets')),
|
||||
self.parse_float(stock_info.get('revenue_yoy')),
|
||||
self.parse_float(stock_info.get('net_profit_yoy')),
|
||||
self.parse_float(stock_info.get('bps')),
|
||||
self.parse_float(stock_info.get('ocfps')),
|
||||
bool(stock_info.get('from_cache', False))
|
||||
))
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"保存股票数据失败: {stock_code}, 错误: {e}")
|
||||
self.log_data_update('stock_data', stock_code, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_latest_stock_data(self, stock_code: str) -> Optional[Dict]:
|
||||
"""获取最新的股票数据"""
|
||||
query = """
|
||||
SELECT sd.*, s.stock_name
|
||||
FROM stock_data sd
|
||||
JOIN stocks s ON sd.stock_code = s.stock_code
|
||||
WHERE sd.stock_code = %s
|
||||
ORDER BY sd.data_date DESC
|
||||
LIMIT 1
|
||||
"""
|
||||
return self._execute_single_query(query, (stock_code,))
|
||||
|
||||
def get_multiple_stocks_data(self, stock_codes: List[str], data_date: str = None) -> List[Dict]:
|
||||
"""批量获取股票数据"""
|
||||
if not stock_codes:
|
||||
return []
|
||||
|
||||
if data_date is None:
|
||||
data_date = self.get_today_date()
|
||||
|
||||
placeholders = ','.join(['%s'] * len(stock_codes))
|
||||
query = f"""
|
||||
SELECT sd.*, s.stock_name
|
||||
FROM stock_data sd
|
||||
JOIN stocks s ON sd.stock_code = s.stock_code
|
||||
WHERE sd.stock_code IN ({placeholders}) AND sd.data_date = %s
|
||||
"""
|
||||
|
||||
return self._execute_query(query, tuple(stock_codes + [data_date]))
|
||||
|
||||
def get_stock_data_history(self, stock_code: str, days: int = 30) -> List[Dict]:
|
||||
"""获取股票历史数据"""
|
||||
query = """
|
||||
SELECT sd.*, s.stock_name
|
||||
FROM stock_data sd
|
||||
JOIN stocks s ON sd.stock_code = s.stock_code
|
||||
WHERE sd.stock_code = %s AND sd.data_date >= DATE_SUB(CURDATE(), INTERVAL %s DAY)
|
||||
ORDER BY sd.data_date DESC
|
||||
"""
|
||||
return self._execute_query(query, (stock_code, days))
|
||||
|
||||
def delete_stock_data(self, stock_code: str, before_date: str = None) -> int:
|
||||
"""删除股票数据"""
|
||||
if before_date:
|
||||
query = "DELETE FROM stock_data WHERE stock_code = %s AND data_date < %s"
|
||||
return self._execute_update(query, (stock_code, before_date))
|
||||
else:
|
||||
query = "DELETE FROM stock_data WHERE stock_code = %s"
|
||||
return self._execute_update(query, (stock_code,))
|
||||
|
||||
def get_stock_count(self) -> int:
|
||||
"""获取股票总数"""
|
||||
query = "SELECT COUNT(*) as count FROM stocks"
|
||||
result = self._execute_single_query(query)
|
||||
return result['count'] if result else 0
|
||||
|
||||
def get_data_date_range(self) -> Optional[Dict]:
|
||||
"""获取数据的日期范围"""
|
||||
query = "SELECT MIN(data_date) as min_date, MAX(data_date) as max_date FROM stock_data"
|
||||
return self._execute_single_query(query)
|
||||
app/dao/watchlist_dao.py (Normal file, 172 lines)
@@ -0,0 +1,172 @@
|
||||
"""
|
||||
监控列表数据访问对象
|
||||
"""
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
from datetime import datetime
|
||||
|
||||
from .base_dao import BaseDAO
|
||||
|
||||
|
||||
class WatchlistDAO(BaseDAO):
|
||||
"""监控列表数据访问对象"""
|
||||
|
||||
def get_watchlist(self) -> List[Dict]:
|
||||
"""获取完整的监控列表,包含股票信息"""
|
||||
query = """
|
||||
SELECT
|
||||
w.stock_code,
|
||||
s.stock_name,
|
||||
s.market,
|
||||
w.target_market_value_min,
|
||||
w.target_market_value_max,
|
||||
w.created_at,
|
||||
w.updated_at
|
||||
FROM watchlist w
|
||||
JOIN stocks s ON w.stock_code = s.stock_code
|
||||
ORDER BY w.created_at DESC
|
||||
"""
|
||||
return self._execute_query(query)
|
||||
|
||||
def add_to_watchlist(self, stock_code: str, target_min: float = None,
|
||||
target_max: float = None) -> bool:
|
||||
"""添加股票到监控列表"""
|
||||
try:
|
||||
# 检查是否已在监控列表中
|
||||
existing = self.get_watchlist_item(stock_code)
|
||||
if existing:
|
||||
# 更新现有的目标市值
|
||||
return self.update_watchlist_item(stock_code, target_min, target_max)
|
||||
|
||||
# 添加新项到监控列表
|
||||
query = """
|
||||
INSERT INTO watchlist (stock_code, target_market_value_min, target_market_value_max)
|
||||
VALUES (%s, %s, %s)
|
||||
"""
|
||||
self._execute_insert(query, (stock_code, target_min, target_max))
|
||||
|
||||
self.log_data_update('watchlist', stock_code, 'success', 'Added to watchlist')
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"添加到监控列表失败: {stock_code}, 错误: {e}")
|
||||
self.log_data_update('watchlist', stock_code, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def remove_from_watchlist(self, stock_code: str) -> bool:
|
||||
"""从监控列表移除股票"""
|
||||
try:
|
||||
query = "DELETE FROM watchlist WHERE stock_code = %s"
|
||||
affected_rows = self._execute_update(query, (stock_code,))
|
||||
success = affected_rows > 0
|
||||
|
||||
if success:
|
||||
self.log_data_update('watchlist', stock_code, 'success', 'Removed from watchlist')
|
||||
else:
|
||||
self.log_data_update('watchlist', stock_code, 'failed', 'Stock not found in watchlist')
|
||||
|
||||
return success
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从监控列表移除失败: {stock_code}, 错误: {e}")
|
||||
self.log_data_update('watchlist', stock_code, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_watchlist_item(self, stock_code: str) -> Optional[Dict]:
|
||||
"""获取监控列表中的单个项目"""
|
||||
query = """
|
||||
SELECT
|
||||
w.stock_code,
|
||||
s.stock_name,
|
||||
s.market,
|
||||
w.target_market_value_min,
|
||||
w.target_market_value_max,
|
||||
w.created_at,
|
||||
w.updated_at
|
||||
FROM watchlist w
|
||||
JOIN stocks s ON w.stock_code = s.stock_code
|
||||
WHERE w.stock_code = %s
|
||||
"""
|
||||
return self._execute_single_query(query, (stock_code,))
|
||||
|
||||
def update_watchlist_item(self, stock_code: str, target_min: float = None,
|
||||
target_max: float = None) -> bool:
|
||||
"""更新监控列表项目"""
|
||||
try:
|
||||
query = """
|
||||
UPDATE watchlist
|
||||
SET target_market_value_min = %s,
|
||||
target_market_value_max = %s,
|
||||
updated_at = CURRENT_TIMESTAMP
|
||||
WHERE stock_code = %s
|
||||
"""
|
||||
affected_rows = self._execute_update(query, (target_min, target_max, stock_code))
|
||||
success = affected_rows > 0
|
||||
|
||||
if success:
|
||||
self.log_data_update('watchlist', stock_code, 'success', 'Updated watchlist item')
|
||||
else:
|
||||
self.log_data_update('watchlist', stock_code, 'failed', 'Stock not found in watchlist')
|
||||
|
||||
return success
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新监控列表失败: {stock_code}, 错误: {e}")
|
||||
self.log_data_update('watchlist', stock_code, 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_watchlist_with_data(self, data_date: str = None) -> List[Dict]:
|
||||
"""获取监控列表及其股票数据"""
|
||||
if data_date is None:
|
||||
data_date = self.get_today_date()
|
||||
|
||||
query = """
|
||||
SELECT
|
||||
w.stock_code,
|
||||
s.stock_name,
|
||||
s.market,
|
||||
w.target_market_value_min,
|
||||
w.target_market_value_max,
|
||||
sd.price,
|
||||
sd.change_percent,
|
||||
sd.market_value as current_market_value,
|
||||
sd.pe_ratio,
|
||||
sd.pb_ratio,
|
||||
sd.from_cache
|
||||
FROM watchlist w
|
||||
JOIN stocks s ON w.stock_code = s.stock_code
|
||||
LEFT JOIN stock_data sd ON w.stock_code = sd.stock_code AND sd.data_date = %s
|
||||
ORDER BY w.created_at DESC
|
||||
"""
|
||||
return self._execute_query(query, (data_date,))
|
||||
|
||||
def clear_watchlist(self) -> bool:
|
||||
"""清空监控列表"""
|
||||
try:
|
||||
query = "DELETE FROM watchlist"
|
||||
self._execute_update(query)
|
||||
self.log_data_update('watchlist', 'all', 'success', 'Cleared watchlist')
|
||||
return True
|
||||
except Exception as e:
|
||||
self.logger.error(f"清空监控列表失败: {e}")
|
||||
self.log_data_update('watchlist', 'all', 'failed', str(e))
|
||||
return False
|
||||
|
||||
def get_watchlist_count(self) -> int:
|
||||
"""获取监控列表股票数量"""
|
||||
query = "SELECT COUNT(*) as count FROM watchlist"
|
||||
result = self._execute_single_query(query)
|
||||
return result['count'] if result else 0
|
||||
|
||||
def get_stocks_needing_update(self, data_date: str = None) -> List[str]:
|
||||
"""获取需要更新数据的股票代码列表"""
|
||||
if data_date is None:
|
||||
data_date = self.get_today_date()
|
||||
|
||||
query = """
|
||||
SELECT DISTINCT w.stock_code
|
||||
FROM watchlist w
|
||||
LEFT JOIN stock_data sd ON w.stock_code = sd.stock_code AND sd.data_date = %s
|
||||
WHERE sd.stock_code IS NULL OR sd.data_date < %s
|
||||
"""
|
||||
results = self._execute_query(query, (data_date, data_date))
|
||||
return [item['stock_code'] for item in results]
|
||||
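WatchlistDAO 到此结束。下面是按上述接口写的一个最小使用示意;目标市值参数的单位以实际建表约定为准,示例中的数值只是占位:

```python
# 示意代码:自选股(监控列表)的增、查、删
from app.dao.watchlist_dao import WatchlistDAO

watchlist_dao = WatchlistDAO()

# 添加(若已存在则更新目标市值区间)
watchlist_dao.add_to_watchlist('600519', target_min=18000, target_max=25000)

# 取当日行情拼接后的监控列表
for item in watchlist_dao.get_watchlist_with_data():
    print(item['stock_code'], item['stock_name'], item.get('current_market_value'))

# 找出今日还缺行情数据、需要刷新的股票
need_update = watchlist_dao.get_stocks_needing_update()

# 移除
watchlist_dao.remove_from_watchlist('600519')
```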
1
app/models/__init__.py
Normal file
@ -0,0 +1 @@
|
||||
# 数据模型模块
|
||||
403
app/scheduler.py
Normal file
@ -0,0 +1,403 @@
|
||||
"""
|
||||
定时任务调度器
|
||||
负责自动更新股票数据、K线数据等定时任务
|
||||
"""
|
||||
import time as std_time  # 标准库 time,用于线程内休眠(datetime.time 已另行导入,避免重名)
|
||||
import logging
|
||||
from datetime import datetime, time, timedelta
|
||||
from typing import Dict, List, Optional
|
||||
import threading
|
||||
from app.services.market_data_service import MarketDataService
|
||||
from app.services.kline_service import KlineService
|
||||
from app.services.stock_service_db import StockServiceDB
|
||||
from app.database import DatabaseManager
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class TaskScheduler:
|
||||
def __init__(self):
|
||||
self.market_service = MarketDataService()
|
||||
self.kline_service = KlineService()
|
||||
self.stock_service = StockServiceDB()
|
||||
self.db_manager = DatabaseManager()
|
||||
self.logger = logging.getLogger(__name__)
|
||||
self.running = False
|
||||
self.scheduler_thread = None
|
||||
|
||||
def start(self):
|
||||
"""启动定时任务调度器"""
|
||||
if self.running:
|
||||
self.logger.warning("任务调度器已在运行")
|
||||
return
|
||||
|
||||
self.running = True
|
||||
self.scheduler_thread = threading.Thread(target=self._run_scheduler, daemon=True)
|
||||
self.scheduler_thread.start()
|
||||
self.logger.info("任务调度器已启动")
|
||||
|
||||
def stop(self):
|
||||
"""停止定时任务调度器"""
|
||||
self.running = False
|
||||
if self.scheduler_thread:
|
||||
self.scheduler_thread.join(timeout=10)
|
||||
self.logger.info("任务调度器已停止")
|
||||
|
||||
def _run_scheduler(self):
|
||||
"""运行调度器主循环"""
|
||||
self.logger.info("任务调度器开始运行")
|
||||
|
||||
while self.running:
|
||||
try:
|
||||
current_time = datetime.now()
|
||||
|
||||
# 检查是否到了执行时间
|
||||
self._check_and_run_tasks(current_time)
|
||||
|
||||
# 每5分钟检查一次
|
||||
for _ in range(60): # 5分钟 = 300秒,每5秒检查一次
|
||||
if not self.running:
|
||||
break
|
||||
                    std_time.sleep(5)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"任务调度器运行错误: {e}")
|
||||
                std_time.sleep(30)  # 出错后等待30秒再继续
|
||||
|
||||
def _check_and_run_tasks(self, current_time: datetime):
|
||||
"""检查并执行定时任务"""
|
||||
try:
|
||||
# 每日上午9:00更新股票列表(每周一)
|
||||
if current_time.weekday() == 0 and current_time.time() >= time(9, 0):
|
||||
if self._should_run_task('update_stock_list', current_time):
|
||||
self._run_task_async('update_stock_list', self._update_stock_list)
|
||||
|
||||
# 每日上午9:30更新K线数据
|
||||
if current_time.time() >= time(9, 30):
|
||||
if self._should_run_task('update_daily_kline', current_time):
|
||||
self._run_task_async('update_daily_kline', self._update_daily_kline)
|
||||
|
||||
# 每日收盘后(16:00)更新市场统计
|
||||
if current_time.time() >= time(16, 0):
|
||||
if self._should_run_task('update_market_stats', current_time):
|
||||
self._run_task_async('update_market_stats', self._update_market_statistics)
|
||||
|
||||
# 每日晚上20:00更新监控列表数据
|
||||
if current_time.time() >= time(20, 0):
|
||||
if self._should_run_task('update_watchlist', current_time):
|
||||
self._run_task_async('update_watchlist', self._update_watchlist_data)
|
||||
|
||||
# 每周日凌晨2:00清理旧数据
|
||||
if current_time.weekday() == 6 and current_time.time() >= time(2, 0):
|
||||
if self._should_run_task('clean_old_data', current_time):
|
||||
self._run_task_async('clean_old_data', self._clean_old_data)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"检查和执行任务失败: {e}")
|
||||
|
||||
def _should_run_task(self, task_name: str, current_time: datetime) -> bool:
|
||||
"""检查任务是否应该执行(避免重复执行)"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
# 检查今天是否已经执行过该任务
|
||||
today = current_time.date()
|
||||
query = """
|
||||
SELECT COUNT(*) as count
|
||||
FROM data_update_tasks
|
||||
WHERE task_type = %s AND DATE(created_at) = %s AND status = 'completed'
|
||||
"""
|
||||
cursor.execute(query, (task_name, today))
|
||||
result = cursor.fetchone()
|
||||
|
||||
cursor.close()
|
||||
return result['count'] == 0
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"检查任务执行状态失败: {task_name}, 错误: {e}")
|
||||
return False
|
||||
|
||||
def _run_task_async(self, task_name: str, task_func):
|
||||
"""异步执行任务"""
|
||||
def run_task():
|
||||
try:
|
||||
                self._create_task_record(task_name, task_name)  # task_type 与 _should_run_task 中按 task_type 查询的条件保持一致
|
||||
|
||||
start_time = datetime.now()
|
||||
result = task_func()
|
||||
end_time = datetime.now()
|
||||
|
||||
duration = (end_time - start_time).total_seconds()
|
||||
|
||||
if isinstance(result, dict) and 'error' in result:
|
||||
self._update_task_record(task_name, 'failed',
|
||||
error_message=result['error'],
|
||||
duration=duration)
|
||||
else:
|
||||
self._update_task_record(task_name, 'completed',
|
||||
processed_count=result.get('processed_count', 0),
|
||||
total_count=result.get('total_count', 0),
|
||||
duration=duration)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"执行任务失败: {task_name}, 错误: {e}")
|
||||
self._update_task_record(task_name, 'failed', error_message=str(e))
|
||||
|
||||
# 在新线程中执行任务
|
||||
task_thread = threading.Thread(target=run_task, daemon=True)
|
||||
task_thread.start()
|
||||
|
||||
def _create_task_record(self, task_name: str, task_type: str):
|
||||
"""创建任务记录"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
query = """
|
||||
INSERT INTO data_update_tasks (task_name, task_type, status, start_time)
|
||||
VALUES (%s, %s, %s, NOW())
|
||||
"""
|
||||
cursor.execute(query, (task_name, task_type, 'running'))
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"创建任务记录失败: {task_name}, 错误: {e}")
|
||||
|
||||
def _update_task_record(self, task_name: str, status: str,
|
||||
processed_count: int = 0, total_count: int = 0,
|
||||
error_message: str = None, duration: float = None):
|
||||
"""更新任务记录"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
query = """
|
||||
UPDATE data_update_tasks
|
||||
SET status = %s, end_time = NOW(),
|
||||
processed_count = %s, total_count = %s,
|
||||
error_message = %s
|
||||
WHERE task_name = %s AND status = 'running'
|
||||
ORDER BY created_at DESC
|
||||
LIMIT 1
|
||||
"""
|
||||
cursor.execute(query, (status, processed_count, total_count, error_message, task_name))
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新任务记录失败: {task_name}, 错误: {e}")
|
||||
|
||||
def _update_stock_list(self) -> Dict:
|
||||
"""更新股票列表"""
|
||||
try:
|
||||
self.logger.info("开始更新股票列表")
|
||||
result = self.market_service.get_all_stock_list(force_refresh=True)
|
||||
|
||||
# 更新概念分类
|
||||
self.market_service.update_stock_sectors()
|
||||
|
||||
return {
|
||||
'total_count': len(result),
|
||||
'processed_count': len(result)
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新股票列表失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def _update_daily_kline(self) -> Dict:
|
||||
"""更新日K线数据"""
|
||||
try:
|
||||
self.logger.info("开始更新日K线数据")
|
||||
result = self.kline_service.batch_update_kline_data(days_back=1)
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新日K线数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def _update_watchlist_data(self) -> Dict:
|
||||
"""更新监控列表数据"""
|
||||
try:
|
||||
self.logger.info("开始更新监控列表数据")
|
||||
result = self.stock_service.batch_update_watchlist_data()
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新监控列表数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def _update_market_statistics(self) -> Dict:
|
||||
"""更新市场统计数据"""
|
||||
try:
|
||||
self.logger.info("开始更新市场统计数据")
|
||||
return self._calculate_market_stats()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新市场统计数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def _calculate_market_stats(self) -> Dict:
|
||||
"""计算市场统计数据"""
|
||||
try:
|
||||
today = datetime.now().date()
|
||||
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
# 计算市场统计
|
||||
query = """
|
||||
INSERT INTO market_statistics (
|
||||
stat_date, market_code, total_stocks, up_stocks, down_stocks,
|
||||
flat_stocks, total_volume, total_amount, created_at
|
||||
)
|
||||
SELECT
|
||||
%s as stat_date,
|
||||
market,
|
||||
COUNT(*) as total_stocks,
|
||||
SUM(CASE WHEN change_percent > 0 THEN 1 ELSE 0 END) as up_stocks,
|
||||
SUM(CASE WHEN change_percent < 0 THEN 1 ELSE 0 END) as down_stocks,
|
||||
SUM(CASE WHEN change_percent = 0 THEN 1 ELSE 0 END) as flat_stocks,
|
||||
COALESCE(SUM(volume), 0) as total_volume,
|
||||
COALESCE(SUM(amount), 0) as total_amount,
|
||||
NOW()
|
||||
FROM (
|
||||
SELECT
|
||||
CASE WHEN stock_code LIKE '6%' THEN 'SH'
|
||||
WHEN stock_code LIKE '0%' OR stock_code LIKE '3%' THEN 'SZ'
|
||||
ELSE 'OTHER' END as market,
|
||||
change_percent,
|
||||
volume,
|
||||
amount
|
||||
FROM kline_data
|
||||
WHERE kline_type = 'daily' AND trade_date = %s
|
||||
) as daily_data
|
||||
GROUP BY market
|
||||
ON DUPLICATE KEY UPDATE
|
||||
total_stocks = VALUES(total_stocks),
|
||||
up_stocks = VALUES(up_stocks),
|
||||
down_stocks = VALUES(down_stocks),
|
||||
flat_stocks = VALUES(flat_stocks),
|
||||
total_volume = VALUES(total_volume),
|
||||
total_amount = VALUES(total_amount),
|
||||
updated_at = NOW()
|
||||
"""
|
||||
|
||||
cursor.execute(query, (today, today))
|
||||
affected_rows = cursor.rowcount
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
return {
|
||||
'processed_count': affected_rows,
|
||||
'total_count': affected_rows
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"计算市场统计数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def _clean_old_data(self) -> Dict:
|
||||
"""清理旧数据"""
|
||||
try:
|
||||
self.logger.info("开始清理旧数据")
|
||||
|
||||
# 清理6个月前的K线数据
|
||||
deleted_count = self.kline_service.clean_old_kline_data(days_to_keep=180)
|
||||
|
||||
# 清理3个月前的任务记录
|
||||
cutoff_date = datetime.now() - timedelta(days=90)
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
cursor.execute("DELETE FROM data_update_tasks WHERE created_at < %s", (cutoff_date,))
|
||||
task_deleted = cursor.rowcount
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
return {
|
||||
'processed_count': deleted_count + task_deleted,
|
||||
'deleted_kline_count': deleted_count,
|
||||
'deleted_task_count': task_deleted
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"清理旧数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def get_task_status(self, task_type: str = None, days: int = 7) -> List[Dict]:
|
||||
"""获取任务执行状态"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
query = """
|
||||
SELECT task_name, task_type, status, start_time, end_time,
|
||||
processed_count, total_count, error_message,
|
||||
TIMESTAMPDIFF(SECOND, start_time, end_time) as duration_seconds
|
||||
FROM data_update_tasks
|
||||
WHERE created_at >= DATE_SUB(NOW(), INTERVAL %s DAY)
|
||||
"""
|
||||
params = [days]
|
||||
|
||||
if task_type:
|
||||
query += " AND task_type = %s"
|
||||
params.append(task_type)
|
||||
|
||||
query += " ORDER BY created_at DESC"
|
||||
|
||||
cursor.execute(query, params)
|
||||
tasks = cursor.fetchall()
|
||||
cursor.close()
|
||||
|
||||
return tasks
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取任务状态失败: {e}")
|
||||
return []
|
||||
|
||||
def run_manual_task(self, task_name: str, **kwargs) -> Dict:
|
||||
"""手动执行任务"""
|
||||
try:
|
||||
self.logger.info(f"手动执行任务: {task_name}")
|
||||
|
||||
task_map = {
|
||||
'update_stock_list': self._update_stock_list,
|
||||
'update_daily_kline': lambda: self._update_daily_kline(),
|
||||
'update_watchlist': self._update_watchlist_data,
|
||||
'update_market_stats': self._update_market_statistics,
|
||||
'clean_old_data': self._clean_old_data
|
||||
}
|
||||
|
||||
if task_name not in task_map:
|
||||
return {'error': f'未知任务: {task_name}'}
|
||||
|
||||
task_func = task_map[task_name]
|
||||
return task_func()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"手动执行任务失败: {task_name}, 错误: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
|
||||
# 全局调度器实例
|
||||
task_scheduler = TaskScheduler()
|
||||
|
||||
|
||||
def start_scheduler():
|
||||
"""启动任务调度器"""
|
||||
task_scheduler.start()
|
||||
|
||||
|
||||
def stop_scheduler():
|
||||
"""停止任务调度器"""
|
||||
task_scheduler.stop()
|
||||
|
||||
|
||||
def get_scheduler_status(task_type: str = None, days: int = 7) -> List[Dict]:
|
||||
"""获取调度器状态"""
|
||||
return task_scheduler.get_task_status(task_type, days)
|
||||
|
||||
|
||||
def run_manual_task(task_name: str, **kwargs) -> Dict:
|
||||
"""手动执行任务"""
|
||||
return task_scheduler.run_manual_task(task_name, **kwargs)
|
||||
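scheduler.py 对外暴露模块级的 start_scheduler / stop_scheduler / run_manual_task / get_scheduler_status。下面是一个接入示意;用 FastAPI 的启动、关闭事件挂接只是假设的一种用法,实际入口以项目的启动脚本为准:

```python
# 示意代码:在应用生命周期内启停调度器,并支持手动补数据
from fastapi import FastAPI
from app.scheduler import start_scheduler, stop_scheduler, run_manual_task, get_scheduler_status

app = FastAPI()

@app.on_event("startup")
def _start_jobs():
    start_scheduler()  # 后台守护线程,按固定时间窗口触发各项任务

@app.on_event("shutdown")
def _stop_jobs():
    stop_scheduler()

# 手动触发一次日K更新,并查看最近一天的任务执行记录
result = run_manual_task('update_daily_kline')
history = get_scheduler_status(days=1)
```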
@ -1,97 +1,29 @@
|
||||
"""
|
||||
基于数据库的AI分析服务
|
||||
"""
|
||||
import json
|
||||
import os
|
||||
from datetime import datetime, date
|
||||
from openai import OpenAI
|
||||
from app.dao import AIAnalysisDAO, ConfigDAO
|
||||
from app.config import Config
|
||||
import logging
|
||||
|
||||
class AIAnalysisService:
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AIAnalysisServiceDB:
|
||||
def __init__(self):
|
||||
# 配置OpenAI客户端连接到Volces API
|
||||
self.model = "ep-20251113170010-6qdcp" # Volces 模型接入点ID
|
||||
self.client = OpenAI(
|
||||
api_key = "ec3ebae6-e131-4b1e-a5ae-30f70468e165", # 豆包大模型APIkey
|
||||
base_url = "https://ark.cn-beijing.volces.com/api/v3"
|
||||
api_key="ec3ebae6-e131-4b1e-a5ae-30f70468e165", # 豆包大模型APIkey
|
||||
base_url="https://ark.cn-beijing.volces.com/api/v3"
|
||||
)
|
||||
# 创建AI分析结果缓存目录
|
||||
self.cache_dir = os.path.join(Config.BASE_DIR, "ai_stock_analysis")
|
||||
self.dao_cache_dir = os.path.join(Config.BASE_DIR, "dao_analysis")
|
||||
self.daka_cache_dir = os.path.join(Config.BASE_DIR, "daka_analysis")
|
||||
|
||||
# 确保所有缓存目录存在
|
||||
for directory in [self.cache_dir, self.dao_cache_dir, self.daka_cache_dir]:
|
||||
if not os.path.exists(directory):
|
||||
os.makedirs(directory)
|
||||
|
||||
def get_cache_path(self, stock_code: str) -> str:
|
||||
"""获取缓存文件路径"""
|
||||
return os.path.join(self.cache_dir, f"{stock_code}.json")
|
||||
|
||||
def get_dao_cache_path(self, stock_code: str) -> str:
|
||||
"""获取道德经分析缓存文件路径"""
|
||||
return os.path.join(self.dao_cache_dir, f"{stock_code}.json")
|
||||
|
||||
def get_daka_cache_path(self, stock_code: str) -> str:
|
||||
"""获取大咖分析缓存文件路径"""
|
||||
return os.path.join(self.daka_cache_dir, f"{stock_code}.json")
|
||||
|
||||
def load_cache(self, stock_code: str):
|
||||
"""加载缓存的AI分析结果"""
|
||||
cache_path = self.get_cache_path(stock_code)
|
||||
if os.path.exists(cache_path):
|
||||
try:
|
||||
with open(cache_path, 'r', encoding='utf-8') as f:
|
||||
return json.load(f)
|
||||
except Exception as e:
|
||||
print(f"读取AI分析缓存失败: {str(e)}")
|
||||
return None
|
||||
|
||||
def save_cache(self, stock_code: str, analysis_result: dict):
|
||||
"""保存AI分析结果到缓存"""
|
||||
cache_path = self.get_cache_path(stock_code)
|
||||
try:
|
||||
with open(cache_path, 'w', encoding='utf-8') as f:
|
||||
json.dump(analysis_result, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"保存AI分析缓存失败: {str(e)}")
|
||||
|
||||
def load_dao_cache(self, stock_code: str):
|
||||
"""加载缓存的道德经分析结果"""
|
||||
cache_path = self.get_dao_cache_path(stock_code)
|
||||
if os.path.exists(cache_path):
|
||||
try:
|
||||
with open(cache_path, 'r', encoding='utf-8') as f:
|
||||
return json.load(f)
|
||||
except Exception as e:
|
||||
print(f"读取道德经分析缓存失败: {str(e)}")
|
||||
return None
|
||||
|
||||
def save_dao_cache(self, stock_code: str, analysis_result: dict):
|
||||
"""保存道德经分析结果到缓存"""
|
||||
cache_path = self.get_dao_cache_path(stock_code)
|
||||
try:
|
||||
with open(cache_path, 'w', encoding='utf-8') as f:
|
||||
json.dump(analysis_result, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"保存道德经分析缓存失败: {str(e)}")
|
||||
|
||||
def load_daka_cache(self, stock_code: str):
|
||||
"""加载缓存的大咖分析结果"""
|
||||
cache_path = self.get_daka_cache_path(stock_code)
|
||||
if os.path.exists(cache_path):
|
||||
try:
|
||||
with open(cache_path, 'r', encoding='utf-8') as f:
|
||||
return json.load(f)
|
||||
except Exception as e:
|
||||
print(f"读取大咖分析缓存失败: {str(e)}")
|
||||
return None
|
||||
|
||||
def save_daka_cache(self, stock_code: str, analysis_result: dict):
|
||||
"""保存大咖分析结果到缓存"""
|
||||
cache_path = self.get_daka_cache_path(stock_code)
|
||||
try:
|
||||
with open(cache_path, 'w', encoding='utf-8') as f:
|
||||
json.dump(analysis_result, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"保存大咖分析缓存失败: {str(e)}")
|
||||
# 数据访问对象
|
||||
self.ai_dao = AIAnalysisDAO()
|
||||
self.config_dao = ConfigDAO()
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
def analyze_value_investment(self, analysis_data: dict, force_refresh: bool = False):
|
||||
"""
|
||||
@ -102,23 +34,21 @@ class AIAnalysisService:
|
||||
"""
|
||||
try:
|
||||
stock_code = analysis_data["stock_info"]["code"]
|
||||
|
||||
# 如果不是强制刷新,尝试从缓存加载
|
||||
today = self.ai_dao.get_today_date()
|
||||
|
||||
# 如果不是强制刷新,尝试从数据库加载
|
||||
if not force_refresh:
|
||||
cached_result = self.load_cache(stock_code)
|
||||
cached_result = self.ai_dao.get_analysis(stock_code, 'stock', today)
|
||||
if cached_result:
|
||||
print(f"从缓存加载AI分析结果: {stock_code}")
|
||||
return cached_result
|
||||
logger.info(f"从数据库加载AI分析结果: {stock_code}")
|
||||
return self.ai_dao.format_analysis_data(cached_result)
|
||||
|
||||
# 打印输入数据用于调试
|
||||
print(f"输入的分析数据: {json.dumps(analysis_data, ensure_ascii=False, indent=2)}")
|
||||
|
||||
logger.info(f"开始AI价值投资分析: {stock_code}")
|
||||
|
||||
# 构建提示词
|
||||
prompt = self._build_analysis_prompt(analysis_data)
|
||||
|
||||
# 打印提示词用于调试
|
||||
print(f"AI分析提示词: {prompt}")
|
||||
|
||||
|
||||
# 调用API
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.model,
|
||||
@ -134,23 +64,27 @@ class AIAnalysisService:
|
||||
}
|
||||
]
|
||||
)
|
||||
|
||||
|
||||
# 获取分析结果
|
||||
analysis_text = response.choices[0].message.content
|
||||
print(f"AI原始返回结果: {analysis_text}")
|
||||
|
||||
logger.info(f"AI分析完成: {stock_code}")
|
||||
|
||||
try:
|
||||
# 尝试解析JSON
|
||||
analysis_result = json.loads(analysis_text)
|
||||
print(f"解析后的JSON结果: {json.dumps(analysis_result, ensure_ascii=False, indent=2)}")
|
||||
|
||||
# 保存到缓存
|
||||
self.save_cache(stock_code, analysis_result)
|
||||
|
||||
|
||||
# 添加缓存标识
|
||||
analysis_result['from_cache'] = False
|
||||
|
||||
# 保存到数据库
|
||||
success = self.ai_dao.save_analysis(stock_code, 'stock', analysis_result, today)
|
||||
if not success:
|
||||
logger.warning(f"保存AI分析结果失败: {stock_code}")
|
||||
|
||||
return analysis_result
|
||||
|
||||
|
||||
except json.JSONDecodeError as e:
|
||||
print(f"JSON解析失败: {str(e)}")
|
||||
logger.error(f"JSON解析失败: {str(e)}")
|
||||
# 如果JSON解析失败,返回错误信息
|
||||
error_result = {
|
||||
'stock_info': analysis_data.get('stock_info', {}),
|
||||
@ -164,113 +98,172 @@ class AIAnalysisService:
|
||||
'analysis_result': {
|
||||
"error": "AI返回的结果不是有效的JSON格式",
|
||||
"raw_text": analysis_text
|
||||
}
|
||||
},
|
||||
'from_cache': False
|
||||
}
|
||||
return error_result
|
||||
|
||||
|
||||
except Exception as e:
|
||||
print(f"AI分析失败: {str(e)}")
|
||||
logger.error(f"AI分析失败: {str(e)}")
|
||||
return {"error": f"AI分析失败: {str(e)}"}
|
||||
|
||||
def _parse_analysis_result(self, analysis_text, current_price):
|
||||
|
||||
def analyze_tao_philosophy(self, company_info: dict, force_refresh: bool = False):
|
||||
"""
|
||||
解析AI返回的分析文本,提取结构化信息
|
||||
基于道德经理念分析公司
|
||||
:param company_info: 公司信息
|
||||
:param force_refresh: 是否强制刷新分析结果
|
||||
:return: AI分析结果
|
||||
"""
|
||||
try:
|
||||
print(f"开始解析分析文本...")
|
||||
|
||||
# 提取投资建议
|
||||
suggestion_pattern = r"投资建议[::]([\s\S]*?)(?=\n\n|$)"
|
||||
suggestion_match = re.search(suggestion_pattern, analysis_text, re.MULTILINE | re.DOTALL)
|
||||
investment_suggestion = suggestion_match.group(1).strip() if suggestion_match else ""
|
||||
print(f"提取到的投资建议: {investment_suggestion}")
|
||||
|
||||
# 提取合理价格区间
|
||||
price_pattern = r"合理股价区间[::]\s*(\d+\.?\d*)\s*[元-]\s*(\d+\.?\d*)[元]"
|
||||
price_match = re.search(price_pattern, analysis_text)
|
||||
if price_match:
|
||||
price_min = float(price_match.group(1))
|
||||
price_max = float(price_match.group(2))
|
||||
else:
|
||||
price_min = current_price * 0.8
|
||||
price_max = current_price * 1.2
|
||||
print(f"提取到的价格区间: {price_min}-{price_max}")
|
||||
|
||||
# 提取目标市值区间(单位:亿元)
|
||||
market_value_pattern = r"目标市值区间[::]\s*(\d+\.?\d*)\s*[亿-]\s*(\d+\.?\d*)[亿]"
|
||||
market_value_match = re.search(market_value_pattern, analysis_text)
|
||||
if market_value_match:
|
||||
market_value_min = float(market_value_match.group(1))
|
||||
market_value_max = float(market_value_match.group(2))
|
||||
else:
|
||||
# 尝试从文本中提取计算得出的市值
|
||||
calc_pattern = r"最低市值[=≈约]*(\d+\.?\d*)[亿].*最高市值[=≈约]*(\d+\.?\d*)[亿]"
|
||||
calc_match = re.search(calc_pattern, analysis_text)
|
||||
if calc_match:
|
||||
market_value_min = float(calc_match.group(1))
|
||||
market_value_max = float(calc_match.group(2))
|
||||
else:
|
||||
market_value_min = 0
|
||||
market_value_max = 0
|
||||
print(f"提取到的市值区间: {market_value_min}-{market_value_max}")
|
||||
|
||||
# 提取各个分析维度的内容
|
||||
analysis_patterns = {
|
||||
"valuation_analysis": r"估值分析([\s\S]*?)(?=###\s*财务状况分析|###\s*成长性分析|$)",
|
||||
"financial_health": r"财务状况分析([\s\S]*?)(?=###\s*成长性分析|###\s*风险评估|$)",
|
||||
"growth_potential": r"成长性分析([\s\S]*?)(?=###\s*风险评估|###\s*投资建议|$)",
|
||||
"risk_assessment": r"风险评估([\s\S]*?)(?=###\s*投资建议|$)"
|
||||
}
|
||||
|
||||
analysis_results = {}
|
||||
for key, pattern in analysis_patterns.items():
|
||||
match = re.search(pattern, analysis_text, re.MULTILINE | re.DOTALL)
|
||||
content = match.group(1).strip() if match else ""
|
||||
# 移除markdown标记和多余的空白字符
|
||||
content = re.sub(r'[#\-*]', '', content).strip()
|
||||
analysis_results[key] = content
|
||||
print(f"提取到的{key}: {content[:100]}...")
|
||||
|
||||
return {
|
||||
"investment_suggestion": investment_suggestion,
|
||||
"analysis": analysis_results,
|
||||
"price_analysis": {
|
||||
"reasonable_price_range": {
|
||||
"min": price_min,
|
||||
"max": price_max
|
||||
},
|
||||
"target_market_value": {
|
||||
"min": market_value_min,
|
||||
"max": market_value_max
|
||||
stock_code = company_info.get('basic_info', {}).get('code')
|
||||
today = self.ai_dao.get_today_date()
|
||||
|
||||
# 如果不是强制刷新,尝试从数据库加载
|
||||
if not force_refresh and stock_code:
|
||||
cached_result = self.ai_dao.get_analysis(stock_code, 'dao', today)
|
||||
if cached_result:
|
||||
logger.info(f"从数据库加载道德经分析结果: {stock_code}")
|
||||
return self.ai_dao.format_analysis_data(cached_result)
|
||||
|
||||
# 构建提示词
|
||||
prompt = self._build_tao_analysis_prompt(company_info)
|
||||
|
||||
# 调用API
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.model,
|
||||
messages=[
|
||||
{
|
||||
"role": "user",
|
||||
"content": prompt
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
]
|
||||
)
|
||||
|
||||
# 获取分析结果
|
||||
analysis_text = response.choices[0].message.content
|
||||
logger.info(f"道德经分析完成: {stock_code}")
|
||||
|
||||
try:
|
||||
# 解析JSON结果
|
||||
analysis_result = json.loads(analysis_text)
|
||||
|
||||
# 添加缓存标识
|
||||
analysis_result['from_cache'] = False
|
||||
|
||||
# 保存到数据库
|
||||
if stock_code:
|
||||
success = self.ai_dao.save_analysis(stock_code, 'dao', analysis_result, today)
|
||||
if not success:
|
||||
logger.warning(f"保存道德经分析结果失败: {stock_code}")
|
||||
|
||||
return analysis_result
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"道德经分析结果JSON解析失败: {str(e)}")
|
||||
return {"error": "分析结果格式错误", "from_cache": False}
|
||||
|
||||
except Exception as e:
|
||||
print(f"解析分析结果失败: {str(e)}")
|
||||
print(f"错误详情: {e.__class__.__name__}")
|
||||
import traceback
|
||||
print(f"错误堆栈: {traceback.format_exc()}")
|
||||
return {
|
||||
"investment_suggestion": "分析结果解析失败",
|
||||
"analysis": {
|
||||
"valuation_analysis": "解析失败",
|
||||
"financial_health": "解析失败",
|
||||
"growth_potential": "解析失败",
|
||||
"risk_assessment": "解析失败"
|
||||
},
|
||||
"price_analysis": {
|
||||
"reasonable_price_range": {
|
||||
"min": current_price * 0.8,
|
||||
"max": current_price * 1.2
|
||||
},
|
||||
"target_market_value": {
|
||||
"min": 0,
|
||||
"max": 0
|
||||
logger.error(f"道德经分析失败: {str(e)}")
|
||||
return {"error": f"道德经分析失败: {str(e)}", "from_cache": False}
|
||||
|
||||
def analyze_by_masters(self, company_info: dict, value_analysis: dict, force_refresh: bool = False):
|
||||
"""
|
||||
基于各位价值投资大咖的理念分析公司
|
||||
:param company_info: 公司信息
|
||||
:param value_analysis: 价值分析数据
|
||||
:param force_refresh: 是否强制刷新分析结果
|
||||
:return: AI分析结果
|
||||
"""
|
||||
try:
|
||||
stock_code = company_info.get('basic_info', {}).get('code')
|
||||
today = self.ai_dao.get_today_date()
|
||||
|
||||
# 如果不是强制刷新,尝试从数据库加载
|
||||
if not force_refresh and stock_code:
|
||||
cached_result = self.ai_dao.get_analysis(stock_code, 'daka', today)
|
||||
if cached_result:
|
||||
logger.info(f"从数据库加载大咖分析结果: {stock_code}")
|
||||
return self.ai_dao.format_analysis_data(cached_result)
|
||||
|
||||
logger.info(f"开始大咖分析: {stock_code}")
|
||||
|
||||
# 构建提示词
|
||||
prompt = self._build_masters_analysis_prompt(company_info, value_analysis)
|
||||
|
||||
# 调用API
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.model,
|
||||
messages=[
|
||||
{
|
||||
"role": "user",
|
||||
"content": prompt
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
]
|
||||
)
|
||||
|
||||
# 获取分析结果
|
||||
analysis_text = response.choices[0].message.content
|
||||
logger.info(f"大咖分析完成: {stock_code}")
|
||||
|
||||
try:
|
||||
# 解析JSON结果
|
||||
analysis_result = json.loads(analysis_text)
|
||||
|
||||
# 添加缓存标识
|
||||
analysis_result['from_cache'] = False
|
||||
|
||||
# 保存到数据库
|
||||
if stock_code:
|
||||
success = self.ai_dao.save_analysis(stock_code, 'daka', analysis_result, today)
|
||||
if not success:
|
||||
logger.warning(f"保存大咖分析结果失败: {stock_code}")
|
||||
|
||||
return analysis_result
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"大咖分析结果JSON解析失败: {str(e)}")
|
||||
return {"error": "分析结果格式错误", "from_cache": False}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"价值投资大咖分析失败: {str(e)}")
|
||||
return {"error": f"价值投资大咖分析失败: {str(e)}", "from_cache": False}
|
||||
|
||||
def get_analysis_history(self, stock_code: str, analysis_type: str, days: int = 30):
|
||||
"""获取分析历史"""
|
||||
try:
|
||||
return self.ai_dao.get_analysis_history(stock_code, analysis_type, days)
|
||||
except Exception as e:
|
||||
logger.error(f"获取分析历史失败: {stock_code}, {analysis_type}, 错误: {e}")
|
||||
return []
|
||||
|
||||
def get_latest_analysis(self, stock_code: str, analysis_type: str):
|
||||
"""获取最新的分析结果"""
|
||||
try:
|
||||
latest = self.ai_dao.get_latest_analysis(stock_code, analysis_type)
|
||||
if latest:
|
||||
return self.ai_dao.format_analysis_data(latest)
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error(f"获取最新分析失败: {stock_code}, {analysis_type}, 错误: {e}")
|
||||
return None
|
||||
|
||||
def get_all_analysis_types(self, stock_code: str, analysis_date: str = None):
|
||||
"""获取股票的所有类型分析"""
|
||||
try:
|
||||
if analysis_date is None:
|
||||
analysis_date = self.ai_dao.get_today_date()
|
||||
|
||||
records = self.ai_dao.get_all_analysis_types(stock_code, analysis_date)
|
||||
results = {}
|
||||
|
||||
for record in records:
|
||||
analysis_type = record['analysis_type']
|
||||
results[analysis_type] = self.ai_dao.format_analysis_data(record)
|
||||
|
||||
return results
|
||||
except Exception as e:
|
||||
logger.error(f"获取所有分析类型失败: {stock_code}, 错误: {e}")
|
||||
return {}
|
||||
|
||||
# 复用原有的提示词构建方法
|
||||
def _build_analysis_prompt(self, data):
|
||||
"""
|
||||
构建AI分析提示词
|
||||
@ -283,7 +276,7 @@ class AIAnalysisService:
|
||||
solvency = data.get('solvency', {})
|
||||
cash_flow = data.get('cash_flow', {})
|
||||
per_share = data.get('per_share', {})
|
||||
|
||||
|
||||
# 格式化数值,保留4位小数
|
||||
def format_number(value):
|
||||
try:
|
||||
@ -304,7 +297,7 @@ class AIAnalysisService:
|
||||
return str(value)
|
||||
except:
|
||||
return "0.0000"
|
||||
|
||||
|
||||
# 格式化百分比,保留2位小数
|
||||
def format_percent(value):
|
||||
try:
|
||||
@ -403,66 +396,15 @@ class AIAnalysisService:
|
||||
|
||||
# 组合完整的提示词
|
||||
prompt = data_section + analysis_requirements
|
||||
|
||||
return prompt
|
||||
|
||||
def analyze_tao_philosophy(self, company_info: dict, force_refresh: bool = False):
|
||||
"""
|
||||
基于道德经理念分析公司
|
||||
:param company_info: 公司信息
|
||||
:param force_refresh: 是否强制刷新分析结果
|
||||
:return: AI分析结果
|
||||
"""
|
||||
try:
|
||||
stock_code = company_info.get('basic_info', {}).get('code')
|
||||
|
||||
# 如果不是强制刷新,尝试从缓存加载
|
||||
if not force_refresh and stock_code:
|
||||
cached_result = self.load_dao_cache(stock_code)
|
||||
if cached_result:
|
||||
print(f"从缓存加载道德经分析结果: {stock_code}")
|
||||
return cached_result
|
||||
|
||||
# 构建提示词
|
||||
prompt = self._build_tao_analysis_prompt(company_info)
|
||||
|
||||
# 调用API
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.model,
|
||||
messages=[
|
||||
{
|
||||
"role": "user",
|
||||
"content": prompt
|
||||
}
|
||||
]
|
||||
)
|
||||
|
||||
# 获取分析结果
|
||||
analysis_text = response.choices[0].message.content
|
||||
|
||||
try:
|
||||
# 解析JSON结果
|
||||
analysis_result = json.loads(analysis_text)
|
||||
|
||||
# 保存到缓存
|
||||
if stock_code:
|
||||
self.save_dao_cache(stock_code, analysis_result)
|
||||
|
||||
return analysis_result
|
||||
except json.JSONDecodeError as e:
|
||||
print(f"道德经分析结果JSON解析失败: {str(e)}")
|
||||
return {"error": "分析结果格式错误"}
|
||||
|
||||
except Exception as e:
|
||||
print(f"道德经分析失败: {str(e)}")
|
||||
return {"error": f"道德经分析失败: {str(e)}"}
|
||||
|
||||
return prompt
|
||||
|
||||
def _build_tao_analysis_prompt(self, company_info: dict):
|
||||
"""
|
||||
构建道德经分析提示词
|
||||
"""
|
||||
basic_info = company_info.get('basic_info', {})
|
||||
|
||||
|
||||
prompt = f"""请作为一位精通道德经的智者,运用道德经的智慧来分析{basic_info.get('name', '')}({basic_info.get('code', '')})这家公司。
|
||||
|
||||
公司基本信息:
|
||||
@ -493,81 +435,20 @@ class AIAnalysisService:
|
||||
- 持有建议
|
||||
|
||||
请以JSON格式返回分析结果,包含以下字段:
|
||||
1. tao_philosophy: 道德经视角的分析
|
||||
2. business_ethics: 企业道德评估
|
||||
3. investment_advice: 投资建议
|
||||
1. investment_suggestion: 投资建议(summary, action, key_points)
|
||||
2. analysis: 详细分析(道德经视角, 企业道德评估, 风险评估)
|
||||
3. price_analysis: 价格分析(合理价格区间, 目标市值区间)
|
||||
|
||||
分析要客观、专业、深入,同时体现道德经的智慧。"""
|
||||
|
||||
return prompt
|
||||
|
||||
def analyze_by_masters(self, company_info: dict, value_analysis: dict, force_refresh: bool = False):
|
||||
"""
|
||||
基于各位价值投资大咖的理念分析公司
|
||||
:param company_info: 公司信息
|
||||
:param value_analysis: 价值分析数据
|
||||
:param force_refresh: 是否强制刷新分析结果
|
||||
:return: AI分析结果
|
||||
"""
|
||||
try:
|
||||
stock_code = company_info.get('basic_info', {}).get('code')
|
||||
|
||||
# 如果不是强制刷新,尝试从缓存加载
|
||||
if not force_refresh and stock_code:
|
||||
cached_result = self.load_daka_cache(stock_code)
|
||||
if cached_result:
|
||||
print(f"从缓存加载大咖分析结果: {stock_code}")
|
||||
return cached_result
|
||||
|
||||
# 打印输入数据用于调试
|
||||
print(f"公司信息: {json.dumps(company_info, ensure_ascii=False, indent=2)}")
|
||||
print(f"价值分析数据: {json.dumps(value_analysis, ensure_ascii=False, indent=2)}")
|
||||
|
||||
# 构建提示词
|
||||
prompt = self._build_masters_analysis_prompt(company_info, value_analysis)
|
||||
|
||||
# 打印提示词用于调试
|
||||
print(f"大咖分析提示词: {prompt}")
|
||||
|
||||
# 调用API
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.model,
|
||||
messages=[
|
||||
{
|
||||
"role": "user",
|
||||
"content": prompt
|
||||
}
|
||||
]
|
||||
)
|
||||
|
||||
# 获取分析结果
|
||||
analysis_text = response.choices[0].message.content
|
||||
print(f"AI原始返回结果: {analysis_text}")
|
||||
|
||||
try:
|
||||
# 解析JSON结果
|
||||
analysis_result = json.loads(analysis_text)
|
||||
print(f"解析后的JSON结果: {json.dumps(analysis_result, ensure_ascii=False, indent=2)}")
|
||||
|
||||
# 保存到缓存
|
||||
if stock_code:
|
||||
self.save_daka_cache(stock_code, analysis_result)
|
||||
|
||||
return analysis_result
|
||||
except json.JSONDecodeError as e:
|
||||
print(f"大咖分析结果JSON解析失败: {str(e)}")
|
||||
return {"error": "分析结果格式错误"}
|
||||
|
||||
except Exception as e:
|
||||
print(f"价值投资大咖分析失败: {str(e)}")
|
||||
return {"error": f"价值投资大咖分析失败: {str(e)}"}
|
||||
|
||||
return prompt
|
||||
|
||||
def _build_masters_analysis_prompt(self, company_info: dict, value_analysis: dict):
|
||||
"""
|
||||
构建价值投资大咖分析提示词
|
||||
"""
|
||||
basic_info = company_info.get('basic_info', {})
|
||||
|
||||
|
||||
# 从value_analysis中获取财务数据
|
||||
valuation = value_analysis.get('valuation', {})
|
||||
profitability = value_analysis.get('profitability', {})
|
||||
@ -577,7 +458,7 @@ class AIAnalysisService:
|
||||
cash_flow = value_analysis.get('cash_flow', {})
|
||||
per_share = value_analysis.get('per_share', {})
|
||||
stock_info = value_analysis.get('stock_info', {})
|
||||
|
||||
|
||||
# 格式化百分比
|
||||
def format_percent(value):
|
||||
if value is None:
|
||||
@ -590,7 +471,7 @@ class AIAnalysisService:
|
||||
return f"{value:.2f}%"
|
||||
except:
|
||||
return '-'
|
||||
|
||||
|
||||
# 格式化数字
|
||||
def format_number(value):
|
||||
if value is None:
|
||||
@ -601,7 +482,7 @@ class AIAnalysisService:
|
||||
return f"{value:.4f}"
|
||||
except:
|
||||
return '-'
|
||||
|
||||
|
||||
prompt = f"""请分别以五位价值投资大咖的视角,分析{basic_info.get('name', '')}({basic_info.get('code', '')})这家公司。
|
||||
|
||||
公司基本信息:
|
||||
@ -620,9 +501,6 @@ class AIAnalysisService:
|
||||
当前市场信息:
|
||||
- 当前股价:{format_number(stock_info.get('current_price'))}元
|
||||
- 总市值:{format_number(valuation.get('total_market_value'))}亿元
|
||||
- 流通市值:{format_number(valuation.get('circulating_market_value'))}亿元
|
||||
- 流通比例:{format_percent(valuation.get('circulating_ratio'))}
|
||||
- 换手率:{format_percent(stock_info.get('turnover_ratio'))}
|
||||
|
||||
估值指标:
|
||||
- 市盈率(PE):{format_number(valuation.get('pe_ratio'))}
|
||||
@ -632,42 +510,20 @@ class AIAnalysisService:
|
||||
|
||||
盈利能力指标:
|
||||
- ROE:{format_percent(profitability.get('roe'))}
|
||||
- ROE(扣非):{format_percent(profitability.get('deducted_roe'))}
|
||||
- ROA:{format_percent(profitability.get('roa'))}
|
||||
- 毛利率:{format_percent(profitability.get('gross_margin'))}
|
||||
- 净利率:{format_percent(profitability.get('net_margin'))}
|
||||
|
||||
成长能力指标:
|
||||
- 净利润增长率:{format_percent(growth.get('net_profit_growth'))}
|
||||
- 扣非净利润增长率:{format_percent(growth.get('deducted_net_profit_growth'))}
|
||||
- 营业总收入增长率:{format_percent(growth.get('revenue_growth'))}
|
||||
- 营业收入增长率:{format_percent(growth.get('operating_revenue_growth'))}
|
||||
|
||||
运营能力指标:
|
||||
- 总资产周转率:{format_number(operation.get('asset_turnover'))}
|
||||
- 存货周转率:{format_number(operation.get('inventory_turnover'))}
|
||||
- 应收账款周转率:{format_number(operation.get('receivables_turnover'))}
|
||||
- 流动资产周转率:{format_number(operation.get('current_asset_turnover'))}
|
||||
- 营收增长率:{format_percent(growth.get('revenue_growth'))}
|
||||
|
||||
偿债能力指标:
|
||||
- 流动比率:{format_number(solvency.get('current_ratio'))}
|
||||
- 速动比率:{format_number(solvency.get('quick_ratio'))}
|
||||
- 资产负债率:{format_percent(solvency.get('debt_to_assets'))}
|
||||
- 产权比率:{format_number(solvency.get('equity_ratio'))}
|
||||
|
||||
现金流指标:
|
||||
- 经营现金流/营收:{format_percent(cash_flow.get('ocf_to_revenue'))}
|
||||
- 经营现金流/经营利润:{format_percent(cash_flow.get('ocf_to_operating_profit'))}
|
||||
- 经营现金流同比增长:{format_percent(cash_flow.get('ocf_growth'))}
|
||||
|
||||
每股指标:
|
||||
- 每股收益(EPS):{format_number(per_share.get('eps'))}元
|
||||
- 每股收益(扣非):{format_number(per_share.get('deducted_eps'))}元
|
||||
- 每股净资产:{format_number(per_share.get('bps'))}元
|
||||
- 每股经营现金流:{format_number(per_share.get('ocfps'))}元
|
||||
- 每股留存收益:{format_number(per_share.get('retained_eps'))}元
|
||||
- 每股现金流量:{format_number(per_share.get('cfps'))}元
|
||||
- 每股息税前利润:{format_number(per_share.get('ebit_ps'))}元
|
||||
|
||||
请分别从以下五位投资大师的视角进行分析:
|
||||
|
||||
@ -708,12 +564,10 @@ class AIAnalysisService:
|
||||
- 是否值得长期持有(投资价值判断)
|
||||
|
||||
请以JSON格式返回分析结果,包含以下字段:
|
||||
1. buffett_analysis: 巴菲特的分析观点
|
||||
2. graham_analysis: 格雷厄姆的分析观点
|
||||
3. lin_yuan_analysis: 林园的分析观点
|
||||
4. li_daxiao_analysis: 李大霄的分析观点
|
||||
5. duan_yongping_analysis: 段永平的分析观点
|
||||
1. investment_suggestion: 投资建议(summary, action, key_points)
|
||||
2. analysis: 详细分析(巴菲特视角, 格雷厄姆视角, 林园视角, 李大霄视角, 段永平视角)
|
||||
3. price_analysis: 价格分析(合理价格区间, 目标市值区间)
|
||||
|
||||
分析要客观、专业、深入,并体现每位投资大师的独特投资理念。请基于上述详细的财务数据进行分析(如果指标缺失或异常,请联网获取),尤其是定量指标的解读。"""
|
||||
|
||||
return prompt
|
||||
分析要客观、专业、深入,并体现每位投资大师的独特投资理念。请基于上述详细的财务数据进行分析,尤其是定量指标的解读。"""
|
||||
|
||||
return prompt
|
||||
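以上是 AIAnalysisServiceDB 的完整改动:分析结果的缓存从本地 JSON 文件迁移到数据库,按"股票代码 + 分析类型 + 日期"读写。下面是一个最小调用示意;analysis_data 只列出必需的 stock_info,其余维度字段以 stock_service_db 实际组装的结构为准,示例中的数值为占位:

```python
# 示意代码:当日已有结果时直接读库,否则调用大模型并落库
from app.services.ai_analysis_service_db import AIAnalysisServiceDB

service = AIAnalysisServiceDB()
analysis_data = {
    "stock_info": {"code": "600519", "name": "贵州茅台", "current_price": 1500.0},
    "valuation": {}, "profitability": {}, "growth": {},
    "solvency": {}, "cash_flow": {}, "per_share": {},
}

result = service.analyze_value_investment(analysis_data, force_refresh=False)
print(result.get('from_cache'), result.get('investment_suggestion'))

# 读取最近一次"大咖"视角分析,以及当日所有类型的分析结果
latest_daka = service.get_latest_analysis('600519', 'daka')
all_today = service.get_all_analysis_types('600519')
```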
466
app/services/kline_service.py
Normal file
@ -0,0 +1,466 @@
|
||||
"""
|
||||
K线数据服务
|
||||
获取和管理股票的K线数据(日K、周K、月K)
|
||||
"""
|
||||
import pandas as pd
|
||||
import logging
|
||||
from datetime import datetime, date, timedelta
|
||||
from typing import List, Dict, Optional, Tuple
|
||||
from app import pro
|
||||
from app.database import DatabaseManager
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class KlineService:
|
||||
def __init__(self):
|
||||
self.db_manager = DatabaseManager()
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
def get_kline_data(self, stock_code: str, kline_type: str = 'daily',
|
||||
start_date: str = None, end_date: str = None,
|
||||
limit: int = 100) -> List[Dict]:
|
||||
"""获取K线数据
|
||||
|
||||
Args:
|
||||
stock_code: 股票代码
|
||||
kline_type: K线类型 (daily/weekly/monthly)
|
||||
start_date: 开始日期 (YYYYMMDD)
|
||||
end_date: 结束日期 (YYYYMMDD)
|
||||
limit: 返回数据条数限制
|
||||
|
||||
Returns:
|
||||
K线数据列表
|
||||
"""
|
||||
try:
|
||||
# 优先从数据库获取
|
||||
kline_data = self._get_kline_from_db(stock_code, kline_type, start_date, end_date, limit)
|
||||
if kline_data:
|
||||
return kline_data
|
||||
|
||||
# 从API获取数据
|
||||
self.logger.info(f"从API获取 {stock_code} 的{self._get_kline_name(kline_type)}数据")
|
||||
api_data = self._fetch_kline_from_api(stock_code, kline_type, start_date, end_date, limit)
|
||||
|
||||
# 保存到数据库
|
||||
if api_data:
|
||||
self._save_kline_to_db(api_data, kline_type)
|
||||
return api_data
|
||||
|
||||
return []
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取K线数据失败: {stock_code}, {kline_type}, 错误: {e}")
|
||||
return []
|
||||
|
||||
def _get_kline_from_db(self, stock_code: str, kline_type: str,
|
||||
start_date: str = None, end_date: str = None,
|
||||
limit: int = 100) -> List[Dict]:
|
||||
"""从数据库获取K线数据"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
# 构建查询条件
|
||||
conditions = ["stock_code = %s", "kline_type = %s"]
|
||||
params = [stock_code, kline_type]
|
||||
|
||||
if start_date:
|
||||
conditions.append("trade_date >= %s")
|
||||
params.append(start_date)
|
||||
|
||||
if end_date:
|
||||
conditions.append("trade_date <= %s")
|
||||
params.append(end_date)
|
||||
|
||||
query = f"""
|
||||
SELECT * FROM kline_data
|
||||
WHERE {' AND '.join(conditions)}
|
||||
ORDER BY trade_date DESC
|
||||
LIMIT %s
|
||||
"""
|
||||
params.append(limit)
|
||||
|
||||
cursor.execute(query, params)
|
||||
klines = cursor.fetchall()
|
||||
|
||||
# 转换日期格式并处理数据类型
|
||||
result = []
|
||||
for kline in klines:
|
||||
result.append({
|
||||
'date': kline['trade_date'].strftime('%Y-%m-%d'),
|
||||
'open': float(kline['open_price']),
|
||||
'high': float(kline['high_price']),
|
||||
'low': float(kline['low_price']),
|
||||
'close': float(kline['close_price']),
|
||||
'volume': int(kline['volume']),
|
||||
'amount': float(kline['amount']),
|
||||
'change_percent': float(kline['change_percent']) if kline['change_percent'] else None,
|
||||
'turnover_rate': float(kline['turnover_rate']) if kline['turnover_rate'] else None,
|
||||
'pe_ratio': float(kline['pe_ratio']) if kline['pe_ratio'] else None,
|
||||
'pb_ratio': float(kline['pb_ratio']) if kline['pb_ratio'] else None
|
||||
})
|
||||
|
||||
cursor.close()
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从数据库获取K线数据失败: {e}")
|
||||
return []
|
||||
|
||||
def _fetch_kline_from_api(self, stock_code: str, kline_type: str,
|
||||
start_date: str = None, end_date: str = None,
|
||||
limit: int = 100) -> List[Dict]:
|
||||
"""从tushare API获取K线数据"""
|
||||
try:
|
||||
# 确定ts_code格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
elif stock_code.startswith('68'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
else:
|
||||
self.logger.error(f"不支持的股票代码: {stock_code}")
|
||||
return []
|
||||
|
||||
# 根据K线类型选择API接口
|
||||
if kline_type == 'daily':
|
||||
df = self._fetch_daily_data(ts_code, start_date, end_date, limit)
|
||||
elif kline_type == 'weekly':
|
||||
df = self._fetch_weekly_data(ts_code, start_date, end_date, limit)
|
||||
elif kline_type == 'monthly':
|
||||
df = self._fetch_monthly_data(ts_code, start_date, end_date, limit)
|
||||
else:
|
||||
self.logger.error(f"不支持的K线类型: {kline_type}")
|
||||
return []
|
||||
|
||||
if df is None or df.empty:
|
||||
self.logger.warning(f"未获取到 {stock_code} 的{self._get_kline_name(kline_type)}数据")
|
||||
return []
|
||||
|
||||
# 转换为标准格式
|
||||
result = []
|
||||
for _, row in df.iterrows():
|
||||
try:
|
||||
kline_data = {
|
||||
'stock_code': stock_code,
|
||||
'trade_date': pd.to_datetime(row['trade_date']).date(),
|
||||
'open_price': float(row['open']),
|
||||
'high_price': float(row['high']),
|
||||
'low_price': float(row['low']),
|
||||
'close_price': float(row['close']),
|
||||
'volume': int(row['vol']) if pd.notna(row.get('vol')) else 0,
|
||||
'amount': float(row.get('amount', 0)) / 10000 if pd.notna(row.get('amount')) else 0, # 转换为万元
|
||||
'change_percent': float(row['pct_chg']) / 100 if pd.notna(row.get('pct_chg')) else 0, # 转换为小数
|
||||
'change_amount': float(row.get('change', 0)) if pd.notna(row.get('change')) else 0,
|
||||
}
|
||||
|
||||
# 获取额外的估值指标
|
||||
self._add_valuation_data(kline_data, ts_code, row['trade_date'])
|
||||
|
||||
result.append(kline_data)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"处理K线数据行失败: {e}")
|
||||
continue
|
||||
|
||||
self.logger.info(f"从API获取到 {len(result)} 条{self._get_kline_name(kline_type)}数据")
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从API获取K线数据失败: {stock_code}, {kline_type}, 错误: {e}")
|
||||
return []
|
||||
|
||||
def _fetch_daily_data(self, ts_code: str, start_date: str = None,
|
||||
end_date: str = None, limit: int = 100) -> pd.DataFrame:
|
||||
"""获取日线数据"""
|
||||
try:
|
||||
return pro.daily(
|
||||
ts_code=ts_code,
|
||||
start_date=start_date,
|
||||
end_date=end_date,
|
||||
limit=limit
|
||||
)
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取日线数据失败: {ts_code}, 错误: {e}")
|
||||
return pd.DataFrame()
|
||||
|
||||
def _fetch_weekly_data(self, ts_code: str, start_date: str = None,
|
||||
end_date: str = None, limit: int = 100) -> pd.DataFrame:
|
||||
"""获取周线数据"""
|
||||
try:
|
||||
return pro.weekly(
|
||||
ts_code=ts_code,
|
||||
start_date=start_date,
|
||||
end_date=end_date,
|
||||
limit=limit
|
||||
)
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取周线数据失败: {ts_code}, 错误: {e}")
|
||||
return pd.DataFrame()
|
||||
|
||||
def _fetch_monthly_data(self, ts_code: str, start_date: str = None,
|
||||
end_date: str = None, limit: int = 100) -> pd.DataFrame:
|
||||
"""获取月线数据"""
|
||||
try:
|
||||
return pro.monthly(
|
||||
ts_code=ts_code,
|
||||
start_date=start_date,
|
||||
end_date=end_date,
|
||||
limit=limit
|
||||
)
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取月线数据失败: {ts_code}, 错误: {e}")
|
||||
return pd.DataFrame()
|
||||
|
||||
def _add_valuation_data(self, kline_data: Dict, ts_code: str, trade_date: str):
|
||||
"""添加估值数据"""
|
||||
try:
|
||||
# 获取当日的基本数据
|
||||
daily_basic = pro.daily_basic(
|
||||
ts_code=ts_code,
|
||||
trade_date=trade_date,
|
||||
fields='ts_code,trade_date,pe,pb,dv_ratio,turnover_rate'
|
||||
)
|
||||
|
||||
if not daily_basic.empty:
|
||||
row = daily_basic.iloc[0]
|
||||
kline_data['pe_ratio'] = float(row['pe']) if pd.notna(row['pe']) else None
|
||||
kline_data['pb_ratio'] = float(row['pb']) if pd.notna(row['pb']) else None
|
||||
kline_data['turnover_rate'] = float(row['turnover_rate']) if pd.notna(row['turnover_rate']) else None
|
||||
kline_data['dividend_yield'] = float(row['dv_ratio']) / 100 if pd.notna(row['dv_ratio']) else 0
|
||||
|
||||
except Exception as e:
|
||||
# 估值数据获取失败不影响主要数据
|
||||
pass
|
||||
|
||||
def _save_kline_to_db(self, kline_data_list: List[Dict], kline_type: str):
|
||||
"""保存K线数据到数据库"""
|
||||
try:
|
||||
if not kline_data_list:
|
||||
return
|
||||
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 使用INSERT ... ON DUPLICATE KEY UPDATE批量保存
|
||||
query = """
|
||||
INSERT INTO kline_data (
|
||||
stock_code, kline_type, trade_date, open_price, high_price, low_price,
|
||||
close_price, volume, amount, change_percent, change_amount,
|
||||
turnover_rate, pe_ratio, pb_ratio, created_at
|
||||
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, NOW())
|
||||
ON DUPLICATE KEY UPDATE
|
||||
open_price = VALUES(open_price),
|
||||
high_price = VALUES(high_price),
|
||||
low_price = VALUES(low_price),
|
||||
close_price = VALUES(close_price),
|
||||
volume = VALUES(volume),
|
||||
amount = VALUES(amount),
|
||||
change_percent = VALUES(change_percent),
|
||||
change_amount = VALUES(change_amount),
|
||||
turnover_rate = VALUES(turnover_rate),
|
||||
pe_ratio = VALUES(pe_ratio),
|
||||
pb_ratio = VALUES(pb_ratio),
|
||||
updated_at = NOW()
|
||||
"""
|
||||
|
||||
# 批量插入数据
|
||||
batch_data = []
|
||||
for kline_data in kline_data_list:
|
||||
batch_data.append((
|
||||
kline_data['stock_code'],
|
||||
kline_type,
|
||||
kline_data['trade_date'],
|
||||
kline_data['open_price'],
|
||||
kline_data['high_price'],
|
||||
kline_data['low_price'],
|
||||
kline_data['close_price'],
|
||||
kline_data['volume'],
|
||||
kline_data['amount'],
|
||||
kline_data['change_percent'],
|
||||
kline_data['change_amount'],
|
||||
kline_data.get('turnover_rate'),
|
||||
kline_data.get('pe_ratio'),
|
||||
kline_data.get('pb_ratio')
|
||||
))
|
||||
|
||||
cursor.executemany(query, batch_data)
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
self.logger.info(f"成功保存 {len(kline_data_list)} 条{self._get_kline_name(kline_type)}数据")
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"保存K线数据到数据库失败: {e}")
|
||||
|
||||
def _get_kline_name(self, kline_type: str) -> str:
|
||||
"""获取K线类型的中文名称"""
|
||||
type_names = {
|
||||
'daily': '日K',
|
||||
'weekly': '周K',
|
||||
'monthly': '月K'
|
||||
}
|
||||
return type_names.get(kline_type, kline_type)
|
||||
|
||||
def batch_update_kline_data(self, stock_codes: List[str] = None,
|
||||
kline_type: str = 'daily',
|
||||
days_back: int = 30) -> Dict:
|
||||
"""批量更新K线数据
|
||||
|
||||
Args:
|
||||
stock_codes: 股票代码列表,None表示更新所有股票
|
||||
kline_type: K线类型
|
||||
days_back: 更新最近多少天的数据
|
||||
|
||||
Returns:
|
||||
更新结果统计
|
||||
"""
|
||||
try:
|
||||
if stock_codes is None:
|
||||
# 获取所有活跃股票
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
cursor.execute("SELECT stock_code FROM stocks WHERE is_active = TRUE")
|
||||
stock_codes = [row[0] for row in cursor.fetchall()]
|
||||
cursor.close()
|
||||
|
||||
total_count = len(stock_codes)
|
||||
success_count = 0
|
||||
failed_count = 0
|
||||
|
||||
# 计算日期范围
|
||||
end_date = datetime.now().strftime('%Y%m%d')
|
||||
start_date = (datetime.now() - timedelta(days=days_back)).strftime('%Y%m%d')
|
||||
|
||||
self.logger.info(f"开始批量更新 {total_count} 只股票的{self._get_kline_name(kline_type)}数据")
|
||||
|
||||
for i, stock_code in enumerate(stock_codes):
|
||||
try:
|
||||
kline_data = self._fetch_kline_from_api(
|
||||
stock_code, kline_type, start_date, end_date, days_back
|
||||
)
|
||||
|
||||
if kline_data:
|
||||
self._save_kline_to_db(kline_data, kline_type)
|
||||
success_count += 1
|
||||
else:
|
||||
failed_count += 1
|
||||
|
||||
# 进度日志
|
||||
if (i + 1) % 50 == 0:
|
||||
self.logger.info(f"进度: {i + 1}/{total_count}, 成功: {success_count}, 失败: {failed_count}")
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新股票 {stock_code} 的K线数据失败: {e}")
|
||||
failed_count += 1
|
||||
continue
|
||||
|
||||
result = {
|
||||
'total': total_count,
|
||||
'success': success_count,
|
||||
'failed': failed_count,
|
||||
'kline_type': kline_type,
|
||||
'days_back': days_back
|
||||
}
|
||||
|
||||
self.logger.info(f"批量更新完成: {result}")
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"批量更新K线数据失败: {e}")
|
||||
return {'error': str(e)}
|
||||
|
||||
def get_market_overview(self, limit: int = 20) -> Dict:
|
||||
"""获取市场概览数据"""
|
||||
try:
|
||||
today = datetime.now().strftime('%Y-%m-%d')
|
||||
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
# 获取涨跌统计
|
||||
query = """
|
||||
SELECT
|
||||
COUNT(*) as total_count,
|
||||
SUM(CASE WHEN change_percent > 0 THEN 1 ELSE 0 END) as up_count,
|
||||
SUM(CASE WHEN change_percent < 0 THEN 1 ELSE 0 END) as down_count,
|
||||
SUM(CASE WHEN change_percent = 0 THEN 1 ELSE 0 END) as flat_count,
|
||||
SUM(CASE WHEN change_percent >= 0.095 THEN 1 ELSE 0 END) as limit_up_count,
|
||||
SUM(CASE WHEN change_percent <= -0.095 THEN 1 ELSE 0 END) as limit_down_count,
|
||||
AVG(change_percent) as avg_change,
|
||||
SUM(volume) as total_volume,
|
||||
SUM(amount) as total_amount
|
||||
FROM kline_data
|
||||
WHERE kline_type = 'daily' AND trade_date = %s
|
||||
"""
|
||||
cursor.execute(query, (today,))
|
||||
stats = cursor.fetchone()
|
||||
|
||||
# 获取涨幅榜
|
||||
cursor.execute("""
|
||||
SELECT stock_code, change_percent, close_price, volume
|
||||
FROM kline_data
|
||||
WHERE kline_type = 'daily' AND trade_date = %s AND change_percent IS NOT NULL
|
||||
ORDER BY change_percent DESC
|
||||
LIMIT %s
|
||||
""", (today, limit))
|
||||
top_gainers = cursor.fetchall()
|
||||
|
||||
# 获取跌幅榜
|
||||
cursor.execute("""
|
||||
SELECT stock_code, change_percent, close_price, volume
|
||||
FROM kline_data
|
||||
WHERE kline_type = 'daily' AND trade_date = %s AND change_percent IS NOT NULL
|
||||
ORDER BY change_percent ASC
|
||||
LIMIT %s
|
||||
""", (today, limit))
|
||||
top_losers = cursor.fetchall()
|
||||
|
||||
# 获取成交量榜
|
||||
cursor.execute("""
|
||||
SELECT stock_code, volume, amount, change_percent, close_price
|
||||
FROM kline_data
|
||||
WHERE kline_type = 'daily' AND trade_date = %s
|
||||
ORDER BY volume DESC
|
||||
LIMIT %s
|
||||
""", (today, limit))
|
||||
volume_leaders = cursor.fetchall()
|
||||
|
||||
cursor.close()
|
||||
|
||||
return {
|
||||
'date': today,
|
||||
'statistics': stats,
|
||||
'top_gainers': top_gainers,
|
||||
'top_losers': top_losers,
|
||||
'volume_leaders': volume_leaders
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取市场概览失败: {e}")
|
||||
return {}
|
||||
|
||||
def clean_old_kline_data(self, days_to_keep: int = 365):
|
||||
"""清理旧的K线数据"""
|
||||
try:
|
||||
cutoff_date = (datetime.now() - timedelta(days=days_to_keep)).date()
|
||||
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 删除指定日期之前的数据
|
||||
query = "DELETE FROM kline_data WHERE trade_date < %s"
|
||||
cursor.execute(query, (cutoff_date,))
|
||||
deleted_count = cursor.rowcount
|
||||
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
self.logger.info(f"清理了 {deleted_count} 条旧的K线数据")
|
||||
return deleted_count
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"清理旧K线数据失败: {e}")
|
||||
return 0
|
||||
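KlineService 同时承担按需取数(优先数据库、缺失时回源 tushare 并落库)和批量维护(定时刷新、清理)。下面是一个使用示意,日期参数沿用接口约定的 YYYYMMDD 格式,股票代码为占位:

```python
# 示意代码:单票K线查询 + 批量维护
from app.services.kline_service import KlineService

kline_service = KlineService()

# 取某只股票一月份的日K(先查库,查不到再走 tushare)
daily = kline_service.get_kline_data('600519', kline_type='daily',
                                     start_date='20240101', end_date='20240131', limit=30)
print(len(daily), daily[0] if daily else None)

# 批量补最近一天的日K、清理一年前的旧数据、取市场概览
summary = kline_service.batch_update_kline_data(kline_type='daily', days_back=1)
deleted = kline_service.clean_old_kline_data(days_to_keep=365)
overview = kline_service.get_market_overview(limit=10)
```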
400
app/services/market_data_service.py
Normal file
@ -0,0 +1,400 @@
|
||||
"""
|
||||
全市场股票数据服务
|
||||
获取和管理所有A股股票的基础数据、行业分类、K线数据等
|
||||
"""
|
||||
import pandas as pd
|
||||
import logging
|
||||
from datetime import datetime, date, timedelta
|
||||
from typing import List, Dict, Optional, Tuple
|
||||
from app import pro
|
||||
from app.dao import StockDAO
|
||||
from app.database import DatabaseManager
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class MarketDataService:
|
||||
def __init__(self):
|
||||
self.stock_dao = StockDAO()
|
||||
self.db_manager = DatabaseManager()
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
def get_all_stock_list(self, force_refresh: bool = False) -> List[Dict]:
|
||||
"""获取所有A股股票列表"""
|
||||
try:
|
||||
# 如果不是强制刷新,先从数据库获取
|
||||
if not force_refresh:
|
||||
stocks = self._get_stock_list_from_db()
|
||||
if stocks:
|
||||
self.logger.info(f"从数据库获取到 {len(stocks)} 只股票")
|
||||
return stocks
|
||||
|
||||
# 从tushare获取最新的股票列表
|
||||
self.logger.info("从tushare获取股票列表...")
|
||||
return self._fetch_stock_list_from_api()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取股票列表失败: {e}")
|
||||
return []
|
||||
|
||||
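get_all_stock_list 的策略是:非强制刷新时优先读库,拿不到再调用 tushare 的 stock_basic 逐只落库。下面是一个调用示意(每周一的定时任务即以同样方式强制刷新):

```python
# 示意代码:获取全市场股票列表
from app.services.market_data_service import MarketDataService

market_service = MarketDataService()

stocks = market_service.get_all_stock_list()                    # 日常读取:优先数据库
fresh = market_service.get_all_stock_list(force_refresh=True)   # 强制从 tushare 重新拉取并入库
print(len(stocks), stocks[0]['stock_code'] if stocks else None)
```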
def _get_stock_list_from_db(self) -> List[Dict]:
|
||||
"""从数据库获取股票列表"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
query = """
|
||||
SELECT s.*, i.industry_name,
|
||||
GROUP_CONCAT(DISTINCT sec.sector_name) as sector_names
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
LEFT JOIN stock_sector_relations ssr ON s.stock_code = ssr.stock_code
|
||||
LEFT JOIN sectors sec ON ssr.sector_code = sec.sector_code
|
||||
WHERE s.is_active = TRUE
|
||||
GROUP BY s.stock_code
|
||||
ORDER BY s.stock_code
|
||||
"""
|
||||
cursor.execute(query)
|
||||
stocks = cursor.fetchall()
|
||||
cursor.close()
|
||||
return stocks
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从数据库获取股票列表失败: {e}")
|
||||
return []
|
||||
|
||||
def _fetch_stock_list_from_api(self) -> List[Dict]:
|
||||
"""从tushare API获取股票列表"""
|
||||
try:
|
||||
all_stocks = []
|
||||
|
||||
# 获取A股列表
|
||||
stock_basic = pro.stock_basic(
|
||||
exchange='',
|
||||
list_status='L', # L代表上市
|
||||
fields='ts_code,symbol,name,area,industry,market,list_date'
|
||||
)
|
||||
|
||||
if stock_basic.empty:
|
||||
self.logger.warning("未获取到股票数据")
|
||||
return []
|
||||
|
||||
self.logger.info(f"获取到 {len(stock_basic)} 只股票基础信息")
|
||||
|
||||
# 处理每只股票
|
||||
for _, row in stock_basic.iterrows():
|
||||
try:
|
||||
stock_info = {
|
||||
'stock_code': row['symbol'], # 股票代码 (6位)
|
||||
'stock_name': row['name'],
|
||||
'market': row['market'], # 市场主板/创业板等
|
||||
'industry_code': self._map_industry_code(row['industry']),
|
||||
'area': row.get('area', ''),
|
||||
'list_date': pd.to_datetime(row['list_date']).date() if pd.notna(row['list_date']) else None,
|
||||
'market_type': self._get_market_type(row['symbol'], row['market']),
|
||||
'is_active': True
|
||||
}
|
||||
|
||||
# 保存到数据库
|
||||
self._save_stock_to_db(stock_info)
|
||||
all_stocks.append(stock_info)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"处理股票 {row.get('symbol', 'unknown')} 失败: {e}")
|
||||
continue
|
||||
|
||||
self.logger.info(f"成功保存 {len(all_stocks)} 只股票到数据库")
|
||||
return all_stocks
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从API获取股票列表失败: {e}")
|
||||
return []
|
||||
|
||||
def _save_stock_to_db(self, stock_info: Dict) -> bool:
|
||||
"""保存股票信息到数据库"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 使用INSERT ... ON DUPLICATE KEY UPDATE
|
||||
query = """
|
||||
INSERT INTO stocks (
|
||||
stock_code, stock_name, market, industry_code, area,
|
||||
list_date, market_type, is_active, created_at
|
||||
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, NOW())
|
||||
ON DUPLICATE KEY UPDATE
|
||||
stock_name = VALUES(stock_name),
|
||||
market = VALUES(market),
|
||||
industry_code = VALUES(industry_code),
|
||||
area = VALUES(area),
|
||||
list_date = VALUES(list_date),
|
||||
market_type = VALUES(market_type),
|
||||
is_active = VALUES(is_active),
|
||||
updated_at = NOW()
|
||||
"""
|
||||
|
||||
cursor.execute(query, (
|
||||
stock_info['stock_code'],
|
||||
stock_info['stock_name'],
|
||||
stock_info['market'],
|
||||
stock_info['industry_code'],
|
||||
stock_info['area'],
|
||||
stock_info['list_date'],
|
||||
stock_info['market_type'],
|
||||
stock_info['is_active']
|
||||
))
|
||||
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"保存股票信息失败: {stock_info['stock_code']}, 错误: {e}")
|
||||
return False
|
||||
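注意上面的保存语句依赖 MySQL 特有的 INSERT ... ON DUPLICATE KEY UPDATE;若要按项目目标同时兼容 SQLite,需要改写为 ON CONFLICT 形式。下面是等价 upsert 的示意(假设 stocks.stock_code 上有主键或唯一约束,SQLite 版本 ≥ 3.24,仅为示例,字段取自上文的子集):

```python
# 示意:SQLite 下与上文等价的 upsert 写法(假设性代码)
sqlite_upsert = """
INSERT INTO stocks (stock_code, stock_name, market, industry_code, is_active)
VALUES (?, ?, ?, ?, ?)
ON CONFLICT(stock_code) DO UPDATE SET
    stock_name    = excluded.stock_name,
    market        = excluded.market,
    industry_code = excluded.industry_code,
    is_active     = excluded.is_active
"""
```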
|
||||
def _map_industry_code(self, industry_name: str) -> Optional[str]:
|
||||
"""将行业名称映射到行业代码"""
|
||||
if pd.isna(industry_name) or not industry_name:
|
||||
return None
|
||||
|
||||
industry_mapping = {
|
||||
'计算机': 'I09',
|
||||
'通信': 'I09',
|
||||
'软件和信息技术服务业': 'I09',
|
||||
'医药生物': 'Q17',
|
||||
'生物医药': 'Q17',
|
||||
'医疗器械': 'Q17',
|
||||
'电子': 'C03',
|
||||
'机械设备': 'C03',
|
||||
'化工': 'C03',
|
||||
'汽车': 'C03',
|
||||
'房地产': 'K11',
|
||||
'银行': 'J10',
|
||||
'非银金融': 'J10',
|
||||
'食品饮料': 'C03',
|
||||
'农林牧渔': 'A01',
|
||||
'采掘': 'B02',
|
||||
'钢铁': 'C03',
|
||||
'有色金属': 'C03',
|
||||
'建筑材料': 'C03',
|
||||
'建筑装饰': 'E05',
|
||||
'电气设备': 'C03',
|
||||
'国防军工': 'M13',
|
||||
'交通运输': 'G07',
|
||||
'公用事业': 'D04',
|
||||
'传媒': 'R18',
|
||||
'休闲服务': 'R18',
|
||||
'家用电器': 'C03',
|
||||
'纺织服装': 'C03',
|
||||
'轻工制造': 'C03',
|
||||
'商业贸易': 'F06',
|
||||
'综合': 'S19'
|
||||
}
|
||||
|
||||
# 精确匹配
|
||||
if industry_name in industry_mapping:
|
||||
return industry_mapping[industry_name]
|
||||
|
||||
# 模糊匹配
|
||||
for key, code in industry_mapping.items():
|
||||
if key in industry_name or industry_name in key:
|
||||
return code
|
||||
|
||||
return 'C03' # 默认制造业
|
||||
|
||||
def _get_market_type(self, stock_code: str, market: str) -> str:
|
||||
"""获取市场类型"""
|
||||
if stock_code.startswith('688'):
|
||||
return '科创板'
|
||||
elif stock_code.startswith('300'):
|
||||
return '创业板'
|
||||
elif stock_code.startswith(('600', '601', '603', '605')):
|
||||
return '主板'
|
||||
elif stock_code.startswith(('000', '001', '002', '003')):
|
||||
return '主板'
|
||||
elif stock_code.startswith(('8', '43')):
|
||||
return '新三板'
|
||||
else:
|
||||
return market or '其他'
|
||||
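这两个映射辅助方法的行为可以用几组断言快速验证(假设数据库配置可用、MarketDataService 能正常实例化;断言结果依据上文实现推断,仅为示意):

```python
# 示意:验证行业映射与板块判断(假设性脚本,需先完成数据库初始化)
from app.services.market_data_service import MarketDataService

svc = MarketDataService()
assert svc._map_industry_code('软件和信息技术服务业') == 'I09'
assert svc._map_industry_code('不存在的行业') == 'C03'      # 未命中时默认制造业
assert svc._get_market_type('688001', '主板') == '科创板'
assert svc._get_market_type('300750', '创业板') == '创业板'
print('映射检查通过')
```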
|
||||
def update_stock_sectors(self, stock_codes: List[str] = None) -> int:
|
||||
"""更新股票概念板块信息"""
|
||||
try:
|
||||
if stock_codes is None:
|
||||
# 获取所有股票
|
||||
stock_codes = [stock['stock_code'] for stock in self._get_stock_list_from_db()]
|
||||
|
||||
updated_count = 0
|
||||
total_count = len(stock_codes)
|
||||
|
||||
for stock_code in stock_codes:
|
||||
try:
|
||||
# 这里可以调用概念板块API获取股票所属概念
|
||||
# 由于tushare概念接口限制,这里先做一些基础映射
|
||||
self._update_stock_concepts(stock_code)
|
||||
updated_count += 1
|
||||
|
||||
if updated_count % 100 == 0:
|
||||
self.logger.info(f"已更新 {updated_count}/{total_count} 只股票的概念信息")
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新股票 {stock_code} 概念信息失败: {e}")
|
||||
continue
|
||||
|
||||
self.logger.info(f"完成更新 {updated_count} 只股票的概念信息")
|
||||
return updated_count
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"批量更新股票概念信息失败: {e}")
|
||||
return 0
|
||||
|
||||
def _update_stock_concepts(self, stock_code: str):
|
||||
"""更新单个股票的概念信息"""
|
||||
try:
|
||||
# 基于股票代码做一些基础的概念分类
|
||||
concepts = []
|
||||
|
||||
# 根据股票代码前缀推断概念
|
||||
if stock_code.startswith('688'):
|
||||
concepts.append('BK0500') # 半导体
|
||||
elif stock_code.startswith('300'):
|
||||
concepts.append('BK0896') # 国产软件
|
||||
concepts.append('BK0735') # 新基建
|
||||
|
||||
# 这里可以扩展更多的概念匹配逻辑
|
||||
# 也可以调用第三方API获取更准确的概念分类
|
||||
|
||||
if concepts:
|
||||
self._save_stock_concepts(stock_code, concepts)
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新股票 {stock_code} 概念失败: {e}")
|
||||
|
||||
def _save_stock_concepts(self, stock_code: str, concept_codes: List[str]):
|
||||
"""保存股票概念关联关系"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 先删除现有的概念关联
|
||||
cursor.execute("DELETE FROM stock_sector_relations WHERE stock_code = %s", (stock_code,))
|
||||
|
||||
# 添加新的概念关联
|
||||
for concept_code in concept_codes:
|
||||
query = """
|
||||
INSERT IGNORE INTO stock_sector_relations (stock_code, sector_code)
|
||||
VALUES (%s, %s)
|
||||
"""
|
||||
cursor.execute(query, (stock_code, concept_code))
|
||||
|
||||
conn.commit()
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"保存股票概念关联失败: {stock_code}, 错误: {e}")
|
||||
|
||||
def get_stock_by_industry(self, industry_code: str = None, limit: int = 100) -> List[Dict]:
|
||||
"""根据行业获取股票列表"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
if industry_code:
|
||||
query = """
|
||||
SELECT s.*, i.industry_name
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE s.industry_code = %s AND s.is_active = TRUE
|
||||
ORDER BY s.stock_code
|
||||
LIMIT %s
|
||||
"""
|
||||
cursor.execute(query, (industry_code, limit))
|
||||
else:
|
||||
query = """
|
||||
SELECT s.*, i.industry_name
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
WHERE s.is_active = TRUE
|
||||
ORDER BY s.stock_code
|
||||
LIMIT %s
|
||||
"""
|
||||
cursor.execute(query, (limit,))
|
||||
|
||||
stocks = cursor.fetchall()
|
||||
cursor.close()
|
||||
return stocks
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"根据行业获取股票失败: {e}")
|
||||
return []
|
||||
|
||||
def get_stock_by_sector(self, sector_code: str, limit: int = 100) -> List[Dict]:
|
||||
"""根据概念板块获取股票列表"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
query = """
|
||||
SELECT s.*, sec.sector_name
|
||||
FROM stocks s
|
||||
JOIN stock_sector_relations ssr ON s.stock_code = ssr.stock_code
|
||||
JOIN sectors sec ON ssr.sector_code = sec.sector_code
|
||||
WHERE ssr.sector_code = %s AND s.is_active = TRUE
|
||||
ORDER BY s.stock_code
|
||||
LIMIT %s
|
||||
"""
|
||||
cursor.execute(query, (sector_code, limit))
|
||||
|
||||
stocks = cursor.fetchall()
|
||||
cursor.close()
|
||||
return stocks
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"根据概念获取股票失败: {e}")
|
||||
return []
|
||||
|
||||
def get_industry_list(self) -> List[Dict]:
|
||||
"""获取所有行业列表"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
query = """
|
||||
SELECT i.industry_code, i.industry_name, i.level,
|
||||
COUNT(s.stock_code) as stock_count
|
||||
FROM industries i
|
||||
LEFT JOIN stocks s ON i.industry_code = s.industry_code AND s.is_active = TRUE
|
||||
GROUP BY i.industry_code, i.industry_name, i.level
|
||||
ORDER BY i.industry_code
|
||||
"""
|
||||
cursor.execute(query)
|
||||
industries = cursor.fetchall()
|
||||
cursor.close()
|
||||
return industries
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取行业列表失败: {e}")
|
||||
return []
|
||||
|
||||
def get_sector_list(self) -> List[Dict]:
|
||||
"""获取所有概念板块列表"""
|
||||
try:
|
||||
with self.db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor(dictionary=True)
|
||||
|
||||
query = """
|
||||
SELECT s.sector_code, s.sector_name, s.description,
|
||||
COUNT(st.stock_code) as stock_count
|
||||
FROM sectors s
|
||||
LEFT JOIN stock_sector_relations ssr ON s.sector_code = ssr.sector_code
|
||||
LEFT JOIN stocks st ON ssr.stock_code = st.stock_code AND st.is_active = TRUE
|
||||
GROUP BY s.sector_code, s.sector_name, s.description
|
||||
ORDER BY s.sector_code
|
||||
"""
|
||||
cursor.execute(query)
|
||||
sectors = cursor.fetchall()
|
||||
cursor.close()
|
||||
return sectors
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取概念板块列表失败: {e}")
|
||||
return []
|
||||
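MarketDataService 的典型调用方式大致如下(方法名取自上文;'J10' 为上文映射中银行/非银金融对应的行业代码;仅为使用示意,需先初始化数据库并配置 tushare):

```python
# 使用示意:按行业筛选股票并查看行业分布(假设性脚本)
from app.services.market_data_service import MarketDataService

svc = MarketDataService()
stocks = svc.get_all_stock_list()                          # 优先读库,库为空或强制刷新时回源 tushare
bank_stocks = svc.get_stock_by_industry('J10', limit=20)   # 银行/非银金融
industries = svc.get_industry_list()
print(f"共 {len(stocks)} 只股票,银行类示例 {len(bank_stocks)} 只,行业 {len(industries)} 个")
```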
@@ -1,587 +0,0 @@
|
||||
import json
|
||||
import os
|
||||
from datetime import datetime
|
||||
import pandas as pd
|
||||
from app import pro
|
||||
from app.config import Config
|
||||
import numpy as np
|
||||
|
||||
class StockService:
|
||||
def __init__(self):
|
||||
self.watchlist = {}
|
||||
self.cache_file = os.path.join(Config.BASE_DIR, "stock_cache.json")
|
||||
self.load_watchlist()
|
||||
self.load_cache()
|
||||
|
||||
def load_watchlist(self):
|
||||
try:
|
||||
if os.path.exists(Config.CONFIG_FILE):
|
||||
with open(Config.CONFIG_FILE, 'r', encoding='utf-8') as f:
|
||||
data = json.load(f)
|
||||
self.watchlist = data.get('watchlist', {})
|
||||
except Exception as e:
|
||||
print(f"Error loading watchlist: {str(e)}")
|
||||
self.watchlist = {}
|
||||
|
||||
def _save_watchlist(self):
|
||||
try:
|
||||
with open(Config.CONFIG_FILE, 'w', encoding='utf-8') as f:
|
||||
json.dump({'watchlist': self.watchlist}, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"Error saving watchlist: {str(e)}")
|
||||
|
||||
def load_cache(self):
|
||||
try:
|
||||
if os.path.exists(self.cache_file):
|
||||
with open(self.cache_file, 'r', encoding='utf-8') as f:
|
||||
self.cache_data = json.load(f)
|
||||
else:
|
||||
self.cache_data = {}
|
||||
except Exception as e:
|
||||
print(f"Error loading cache: {str(e)}")
|
||||
self.cache_data = {}
|
||||
|
||||
def save_cache(self, stock_code, data):
|
||||
try:
|
||||
self.cache_data[stock_code] = {
|
||||
'data': data,
|
||||
'timestamp': datetime.now().strftime('%Y-%m-%d')
|
||||
}
|
||||
with open(self.cache_file, 'w', encoding='utf-8') as f:
|
||||
json.dump(self.cache_data, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"Error saving cache: {str(e)}")
|
||||
|
||||
def get_stock_info(self, stock_code: str, force_refresh: bool = False):
|
||||
try:
|
||||
# 检查缓存
|
||||
today = datetime.now().strftime('%Y-%m-%d')
|
||||
if not force_refresh and stock_code in self.cache_data and self.cache_data[stock_code]['timestamp'] == today:
|
||||
print(f"从缓存获取股票 {stock_code} 的数据")
|
||||
cached_data = self.cache_data[stock_code]['data']
|
||||
cached_data['stock_info']['from_cache'] = True
|
||||
return cached_data
|
||||
|
||||
# 如果强制刷新或缓存不存在或已过期,从API获取数据
|
||||
print(f"从API获取股票 {stock_code} 的数据...")
|
||||
|
||||
# 处理股票代码格式
|
||||
if len(stock_code) != 6:
|
||||
return {"error": "股票代码格式错误"}
|
||||
|
||||
# 确定交易所
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
# 获取基本信息和总市值
|
||||
basic_info = pro.daily_basic(ts_code=ts_code, fields='ts_code,total_mv', limit=1)
|
||||
if basic_info.empty:
|
||||
return {"error": "股票代码不存在"}
|
||||
|
||||
# 获取股票名称
|
||||
stock_name = pro.stock_basic(ts_code=ts_code, fields='name').iloc[0]['name']
|
||||
|
||||
# 获取最新财务指标
|
||||
fina_indicator = pro.fina_indicator(ts_code=ts_code, period=datetime.now().strftime('%Y%m%d'), fields='roe,grossprofit_margin,netprofit_margin,debt_to_assets,op_income_yoy,netprofit_yoy,bps,ocfps')
|
||||
if fina_indicator.empty:
|
||||
fina_indicator = pro.fina_indicator(ts_code=ts_code, limit=1)
|
||||
|
||||
# 获取实时行情
|
||||
today = datetime.now().strftime('%Y%m%d')
|
||||
daily_data = pro.daily(ts_code=basic_info['ts_code'].iloc[0], start_date=today, end_date=today)
|
||||
if daily_data.empty:
|
||||
daily_data = pro.daily(ts_code=basic_info['ts_code'].iloc[0], limit=1)
|
||||
if daily_data.empty:
|
||||
return {"error": "无法获取股票行情数据"}
|
||||
|
||||
# 获取市值信息(用于其他指标)
|
||||
daily_basic = pro.daily_basic(ts_code=basic_info['ts_code'].iloc[0],
|
||||
fields='ts_code,trade_date,pe,pb,ps,dv_ratio',
|
||||
limit=1)
|
||||
|
||||
if daily_basic.empty:
|
||||
return {"error": "无法获取股票基础数据"}
|
||||
|
||||
latest_basic = daily_basic.iloc[0]
|
||||
latest_fina = fina_indicator.iloc[0] if not fina_indicator.empty else pd.Series()
|
||||
|
||||
# 计算实时总市值(单位:亿元)
|
||||
current_price = float(daily_data['close'].iloc[0])
|
||||
market_value = float(basic_info['total_mv'].iloc[0]) / 10000 # 转换为亿元
|
||||
print(f"市值计算: 当前价格={current_price}, 总市值={market_value}亿元")
|
||||
|
||||
# 处理股息率:tushare返回的是百分比值,需要转换为小数
|
||||
dv_ratio = float(latest_basic['dv_ratio']) if pd.notna(latest_basic['dv_ratio']) else 0
|
||||
dividend_yield = round(dv_ratio / 100, 4) # 转换为小数
|
||||
|
||||
# 处理财务指标,确保所有值都有默认值0,转换为小数
|
||||
roe = round(float(latest_fina['roe']) / 100, 4) if pd.notna(latest_fina.get('roe')) else 0
|
||||
gross_profit_margin = round(float(latest_fina['grossprofit_margin']) / 100, 4) if pd.notna(latest_fina.get('grossprofit_margin')) else 0
|
||||
net_profit_margin = round(float(latest_fina['netprofit_margin']) / 100, 4) if pd.notna(latest_fina.get('netprofit_margin')) else 0
|
||||
debt_to_assets = round(float(latest_fina['debt_to_assets']) / 100, 4) if pd.notna(latest_fina.get('debt_to_assets')) else 0
|
||||
revenue_yoy = round(float(latest_fina['op_income_yoy']) / 100, 4) if pd.notna(latest_fina.get('op_income_yoy')) else 0
|
||||
net_profit_yoy = round(float(latest_fina['netprofit_yoy']) / 100, 4) if pd.notna(latest_fina.get('netprofit_yoy')) else 0
|
||||
bps = round(float(latest_fina['bps']), 3) if pd.notna(latest_fina.get('bps')) else 0 # 保留3位小数
|
||||
ocfps = round(float(latest_fina['ocfps']), 3) if pd.notna(latest_fina.get('ocfps')) else 0 # 保留3位小数
|
||||
|
||||
stock_info = {
|
||||
"code": stock_code,
|
||||
"name": stock_name,
|
||||
"market_value": round(market_value, 2), # 总市值(亿元)
|
||||
"pe_ratio": round(float(latest_basic['pe']), 2) if pd.notna(latest_basic['pe']) else 0, # 市盈率
|
||||
"pb_ratio": round(float(latest_basic['pb']), 2) if pd.notna(latest_basic['pb']) else 0, # 市净率
|
||||
"ps_ratio": round(float(latest_basic['ps']), 2) if pd.notna(latest_basic['ps']) else 0, # 市销率
|
||||
"dividend_yield": dividend_yield, # 股息率(小数)
|
||||
"price": round(current_price, 2), # 股价保留2位小数
|
||||
"change_percent": round(float(daily_data['pct_chg'].iloc[0]) / 100, 4), # 涨跌幅转换为小数
|
||||
# 财务指标(全部转换为小数)
|
||||
"roe": roe, # ROE(小数)
|
||||
"gross_profit_margin": gross_profit_margin, # 毛利率(小数)
|
||||
"net_profit_margin": net_profit_margin, # 净利率(小数)
|
||||
"debt_to_assets": debt_to_assets, # 资产负债率(小数)
|
||||
"revenue_yoy": revenue_yoy, # 营收增长率(小数)
|
||||
"net_profit_yoy": net_profit_yoy, # 净利润增长率(小数)
|
||||
"bps": bps, # 每股净资产
|
||||
"ocfps": ocfps, # 每股经营现金流
|
||||
"from_cache": False
|
||||
}
|
||||
|
||||
# 获取目标值
|
||||
targets = self.watchlist.get(stock_code, {})
|
||||
|
||||
result = {
|
||||
"stock_info": stock_info,
|
||||
"targets": targets
|
||||
}
|
||||
|
||||
# 保存到缓存
|
||||
self.save_cache(stock_code, result)
|
||||
|
||||
return result
|
||||
except Exception as e:
|
||||
print(f"Error fetching stock info: {str(e)}")
|
||||
import traceback
|
||||
print(f"详细错误: {traceback.format_exc()}")
|
||||
return {"error": f"获取股票数据失败: {str(e)}"}
|
||||
|
||||
def get_watchlist(self):
|
||||
result = []
|
||||
for stock_code, targets in self.watchlist.items():
|
||||
try:
|
||||
# 从缓存获取数据
|
||||
today = datetime.now().strftime('%Y-%m-%d')
|
||||
if stock_code in self.cache_data and self.cache_data[stock_code]['timestamp'] == today:
|
||||
result.append(self.cache_data[stock_code]['data'])
|
||||
continue
|
||||
|
||||
# 如果没有缓存,只获取基本信息
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
print(f"不支持的股票代码: {stock_code}")
|
||||
continue
|
||||
|
||||
# 获取股票名称
|
||||
stock_name = pro.stock_basic(ts_code=ts_code, fields='name').iloc[0]['name']
|
||||
|
||||
result.append({
|
||||
"stock_info": {
|
||||
"code": stock_code,
|
||||
"name": stock_name
|
||||
},
|
||||
"targets": targets
|
||||
})
|
||||
except Exception as e:
|
||||
print(f"Error getting watchlist info for {stock_code}: {str(e)}")
|
||||
continue
|
||||
return result
|
||||
|
||||
def add_watch(self, stock_code: str, target_market_value_min: float = None, target_market_value_max: float = None):
|
||||
self.watchlist[stock_code] = {
|
||||
"target_market_value": {
|
||||
"min": target_market_value_min,
|
||||
"max": target_market_value_max
|
||||
}
|
||||
}
|
||||
self._save_watchlist()
|
||||
return {"status": "success"}
|
||||
|
||||
def remove_watch(self, stock_code: str):
|
||||
if stock_code in self.watchlist:
|
||||
del self.watchlist[stock_code]
|
||||
# 同时删除缓存
|
||||
if stock_code in self.cache_data:
|
||||
del self.cache_data[stock_code]
|
||||
try:
|
||||
with open(self.cache_file, 'w', encoding='utf-8') as f:
|
||||
json.dump(self.cache_data, f, ensure_ascii=False, indent=4)
|
||||
except Exception as e:
|
||||
print(f"Error saving cache after removal: {str(e)}")
|
||||
self._save_watchlist()
|
||||
return {"status": "success"}
|
||||
|
||||
def update_target(self, stock_code: str, target_market_value_min: float = None, target_market_value_max: float = None):
|
||||
"""更新股票的目标市值"""
|
||||
if stock_code not in self.watchlist:
|
||||
return {"error": "股票不在监控列表中"}
|
||||
|
||||
self.watchlist[stock_code] = {
|
||||
"target_market_value": {
|
||||
"min": target_market_value_min,
|
||||
"max": target_market_value_max
|
||||
}
|
||||
}
|
||||
self._save_watchlist()
|
||||
return {"status": "success"}
|
||||
|
||||
def get_index_info(self):
|
||||
"""获取主要指数数据"""
|
||||
try:
|
||||
# 主要指数代码列表
|
||||
index_codes = {
|
||||
'000001.SH': '上证指数',
|
||||
'399001.SZ': '深证成指',
|
||||
'399006.SZ': '创业板指',
|
||||
'000016.SH': '上证50',
|
||||
'000300.SH': '沪深300',
|
||||
'000905.SH': '中证500',
|
||||
'000852.SH': '中证1000',
|
||||
'899050.BJ': '北证50',
|
||||
}
|
||||
|
||||
result = []
|
||||
for ts_code, name in index_codes.items():
|
||||
try:
|
||||
# 获取指数基本信息
|
||||
df = pro.index_daily(ts_code=ts_code, limit=1)
|
||||
if not df.empty:
|
||||
data = df.iloc[0]
|
||||
# 获取K线数据(最近20天)
|
||||
kline_df = pro.index_daily(ts_code=ts_code, limit=20)
|
||||
kline_data = []
|
||||
if not kline_df.empty:
|
||||
for _, row in kline_df.iterrows():
|
||||
kline_data.append({
|
||||
'date': row['trade_date'],
|
||||
'open': float(row['open']),
|
||||
'close': float(row['close']),
|
||||
'high': float(row['high']),
|
||||
'low': float(row['low']),
|
||||
'vol': float(row['vol'])
|
||||
})
|
||||
|
||||
result.append({
|
||||
'code': ts_code,
|
||||
'name': name,
|
||||
'price': float(data['close']),
|
||||
'change': float(data['pct_chg']),
|
||||
'kline_data': kline_data
|
||||
})
|
||||
except Exception as e:
|
||||
print(f"获取指数 {ts_code} 数据失败: {str(e)}")
|
||||
continue
|
||||
|
||||
return result
|
||||
except Exception as e:
|
||||
print(f"获取指数数据失败: {str(e)}")
|
||||
return []
|
||||
|
||||
def get_company_detail(self, stock_code: str):
|
||||
try:
|
||||
print(f"开始获取公司详情: {stock_code}")
|
||||
|
||||
# 处理股票代码格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
print(f"不支持的股票代码格式: {stock_code}")
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
print(f"转换后的ts_code: {ts_code}")
|
||||
|
||||
# 获取公司基本信息
|
||||
basic = pro.stock_basic(ts_code=ts_code, fields='name,industry,area,list_date')
|
||||
if basic.empty:
|
||||
print(f"无法获取公司基本信息: {ts_code}")
|
||||
return {"error": "无法获取公司信息"}
|
||||
|
||||
company_info = basic.iloc[0]
|
||||
print(f"获取到的公司基本信息: {company_info.to_dict()}")
|
||||
|
||||
# 获取公司详细信息
|
||||
try:
|
||||
company_detail = pro.stock_company(ts_code=ts_code)
|
||||
if not company_detail.empty:
|
||||
detail_info = company_detail.iloc[0]
|
||||
company_detail_dict = {
|
||||
"com_name": str(detail_info.get('com_name', '')),
|
||||
"chairman": str(detail_info.get('chairman', '')),
|
||||
"manager": str(detail_info.get('manager', '')),
|
||||
"secretary": str(detail_info.get('secretary', '')),
|
||||
"reg_capital": float(detail_info.get('reg_capital', 0)) if pd.notna(detail_info.get('reg_capital')) else 0,
|
||||
"setup_date": str(detail_info.get('setup_date', '')),
|
||||
"province": str(detail_info.get('province', '')),
|
||||
"city": str(detail_info.get('city', '')),
|
||||
"introduction": str(detail_info.get('introduction', '')),
|
||||
"website": f"http://{str(detail_info.get('website', '')).strip('http://').strip('https://')}" if detail_info.get('website') else "",
|
||||
"email": str(detail_info.get('email', '')),
|
||||
"office": str(detail_info.get('office', '')),
|
||||
"employees": int(detail_info.get('employees', 0)) if pd.notna(detail_info.get('employees')) else 0,
|
||||
"main_business": str(detail_info.get('main_business', '')),
|
||||
"business_scope": str(detail_info.get('business_scope', ''))
|
||||
}
|
||||
else:
|
||||
company_detail_dict = {
|
||||
"com_name": "", "chairman": "", "manager": "", "secretary": "",
|
||||
"reg_capital": 0, "setup_date": "", "province": "", "city": "",
|
||||
"introduction": "", "website": "", "email": "", "office": "",
|
||||
"employees": 0, "main_business": "", "business_scope": ""
|
||||
}
|
||||
except Exception as e:
|
||||
print(f"获取公司详细信息失败: {str(e)}")
|
||||
company_detail_dict = {
|
||||
"com_name": "", "chairman": "", "manager": "", "secretary": "",
|
||||
"reg_capital": 0, "setup_date": "", "province": "", "city": "",
|
||||
"introduction": "", "website": "", "email": "", "office": "",
|
||||
"employees": 0, "main_business": "", "business_scope": ""
|
||||
}
|
||||
|
||||
# 获取最新财务指标
|
||||
try:
|
||||
fina = pro.fina_indicator(ts_code=ts_code, period=datetime.now().strftime('%Y%m%d'))
|
||||
if fina.empty:
|
||||
print("当前期间无财务数据,尝试获取最新一期数据")
|
||||
fina = pro.fina_indicator(ts_code=ts_code, limit=1)
|
||||
|
||||
if fina.empty:
|
||||
print(f"无法获取财务指标数据: {ts_code}")
|
||||
return {"error": "无法获取财务数据"}
|
||||
|
||||
fina_info = fina.iloc[0]
|
||||
print(f"获取到的财务指标: {fina_info.to_dict()}")
|
||||
except Exception as e:
|
||||
print(f"获取财务指标失败: {str(e)}")
|
||||
return {"error": "获取财务指标失败"}
|
||||
|
||||
# 获取市值信息(用于PE、PB等指标)
|
||||
try:
|
||||
daily_basic = pro.daily_basic(ts_code=ts_code, fields='pe,pb,ps,dv_ratio', limit=1)
|
||||
if not daily_basic.empty:
|
||||
latest_basic = daily_basic.iloc[0]
|
||||
else:
|
||||
print("无法获取PE/PB数据")
|
||||
latest_basic = pd.Series({'pe': 0, 'pb': 0, 'ps': 0, 'dv_ratio': 0})
|
||||
except Exception as e:
|
||||
print(f"获取PE/PB失败: {str(e)}")
|
||||
latest_basic = pd.Series({'pe': 0, 'pb': 0, 'ps': 0, 'dv_ratio': 0})
|
||||
|
||||
result = {
|
||||
"basic_info": {
|
||||
"name": str(company_info['name']),
|
||||
"industry": str(company_info['industry']),
|
||||
"list_date": str(company_info['list_date']),
|
||||
"area": str(company_info['area']),
|
||||
**company_detail_dict
|
||||
},
|
||||
"financial_info": {
|
||||
# 估值指标
|
||||
"pe_ratio": float(latest_basic['pe']) if pd.notna(latest_basic['pe']) else 0,
|
||||
"pb_ratio": float(latest_basic['pb']) if pd.notna(latest_basic['pb']) else 0,
|
||||
"ps_ratio": float(latest_basic['ps']) if pd.notna(latest_basic['ps']) else 0,
|
||||
"dividend_yield": float(latest_basic['dv_ratio'])/100 if pd.notna(latest_basic['dv_ratio']) else 0,
|
||||
|
||||
# 盈利能力
|
||||
"roe": float(fina_info['roe']) if pd.notna(fina_info.get('roe')) else 0,
|
||||
"roe_dt": float(fina_info['roe_dt']) if pd.notna(fina_info.get('roe_dt')) else 0,
|
||||
"roa": float(fina_info['roa']) if pd.notna(fina_info.get('roa')) else 0,
|
||||
"grossprofit_margin": float(fina_info['grossprofit_margin']) if pd.notna(fina_info.get('grossprofit_margin')) else 0,
|
||||
"netprofit_margin": float(fina_info['netprofit_margin']) if pd.notna(fina_info.get('netprofit_margin')) else 0,
|
||||
|
||||
# 成长能力
|
||||
"netprofit_yoy": float(fina_info['netprofit_yoy']) if pd.notna(fina_info.get('netprofit_yoy')) else 0,
|
||||
"dt_netprofit_yoy": float(fina_info['dt_netprofit_yoy']) if pd.notna(fina_info.get('dt_netprofit_yoy')) else 0,
|
||||
"tr_yoy": float(fina_info['tr_yoy']) if pd.notna(fina_info.get('tr_yoy')) else 0,
|
||||
"or_yoy": float(fina_info['or_yoy']) if pd.notna(fina_info.get('or_yoy')) else 0,
|
||||
|
||||
# 营运能力
|
||||
"assets_turn": float(fina_info['assets_turn']) if pd.notna(fina_info.get('assets_turn')) else 0,
|
||||
"inv_turn": float(fina_info['inv_turn']) if pd.notna(fina_info.get('inv_turn')) else 0,
|
||||
"ar_turn": float(fina_info['ar_turn']) if pd.notna(fina_info.get('ar_turn')) else 0,
|
||||
"ca_turn": float(fina_info['ca_turn']) if pd.notna(fina_info.get('ca_turn')) else 0,
|
||||
|
||||
# 偿债能力
|
||||
"current_ratio": float(fina_info['current_ratio']) if pd.notna(fina_info.get('current_ratio')) else 0,
|
||||
"quick_ratio": float(fina_info['quick_ratio']) if pd.notna(fina_info.get('quick_ratio')) else 0,
|
||||
"debt_to_assets": float(fina_info['debt_to_assets']) if pd.notna(fina_info.get('debt_to_assets')) else 0,
|
||||
"debt_to_eqt": float(fina_info['debt_to_eqt']) if pd.notna(fina_info.get('debt_to_eqt')) else 0,
|
||||
|
||||
# 现金流
|
||||
"ocf_to_or": float(fina_info['ocf_to_or']) if pd.notna(fina_info.get('ocf_to_or')) else 0,
|
||||
"ocf_to_opincome": float(fina_info['ocf_to_opincome']) if pd.notna(fina_info.get('ocf_to_opincome')) else 0,
|
||||
"ocf_yoy": float(fina_info['ocf_yoy']) if pd.notna(fina_info.get('ocf_yoy')) else 0,
|
||||
|
||||
# 每股指标
|
||||
"eps": float(fina_info['eps']) if pd.notna(fina_info.get('eps')) else 0,
|
||||
"dt_eps": float(fina_info['dt_eps']) if pd.notna(fina_info.get('dt_eps')) else 0,
|
||||
"bps": float(fina_info['bps']) if pd.notna(fina_info.get('bps')) else 0,
|
||||
"ocfps": float(fina_info['ocfps']) if pd.notna(fina_info.get('ocfps')) else 0,
|
||||
"retainedps": float(fina_info['retainedps']) if pd.notna(fina_info.get('retainedps')) else 0,
|
||||
"cfps": float(fina_info['cfps']) if pd.notna(fina_info.get('cfps')) else 0,
|
||||
"ebit_ps": float(fina_info['ebit_ps']) if pd.notna(fina_info.get('ebit_ps')) else 0,
|
||||
"fcff_ps": float(fina_info['fcff_ps']) if pd.notna(fina_info.get('fcff_ps')) else 0,
|
||||
"fcfe_ps": float(fina_info['fcfe_ps']) if pd.notna(fina_info.get('fcfe_ps')) else 0
|
||||
}
|
||||
}
|
||||
|
||||
print(f"返回结果: {result}")
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error getting company detail: {str(e)}")
|
||||
import traceback
|
||||
print(f"详细错误: {traceback.format_exc()}")
|
||||
return {"error": f"获取公司详情失败: {str(e)}"}
|
||||
|
||||
def get_top_holders(self, stock_code: str):
|
||||
"""获取前十大股东数据"""
|
||||
try:
|
||||
# 处理股票代码格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
# 获取最新一期的股东数据
|
||||
df = pro.top10_holders(ts_code=ts_code, limit=10)
|
||||
if df.empty:
|
||||
return {"error": "暂无股东数据"}
|
||||
|
||||
# 按持股比例降序排序
|
||||
df = df.sort_values('hold_ratio', ascending=False)
|
||||
|
||||
# 获取最新的报告期
|
||||
latest_end_date = df['end_date'].max()
|
||||
latest_data = df[df['end_date'] == latest_end_date]
|
||||
|
||||
holders = []
|
||||
for _, row in latest_data.iterrows():
|
||||
holders.append({
|
||||
"holder_name": str(row['holder_name']),
|
||||
"hold_amount": float(row['hold_amount']) if pd.notna(row['hold_amount']) else 0,
|
||||
"hold_ratio": float(row['hold_ratio']) if pd.notna(row['hold_ratio']) else 0,
|
||||
"hold_change": float(row['hold_change']) if pd.notna(row['hold_change']) else 0,
|
||||
"ann_date": str(row['ann_date']),
|
||||
"end_date": str(row['end_date'])
|
||||
})
|
||||
|
||||
result = {
|
||||
"holders": holders,
|
||||
"total_ratio": sum(holder['hold_ratio'] for holder in holders), # 合计持股比例
|
||||
"report_date": str(latest_end_date) # 报告期
|
||||
}
|
||||
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
print(f"获取股东数据失败: {str(e)}")
|
||||
import traceback
|
||||
print(f"详细错误: {traceback.format_exc()}")
|
||||
return {"error": f"获取股东数据失败: {str(e)}"}
|
||||
|
||||
def get_value_analysis_data(self, stock_code: str):
|
||||
"""获取价值投资分析所需的关键财务指标"""
|
||||
try:
|
||||
# 处理股票代码格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
# 获取最新每日指标(估值数据)
|
||||
daily_basic = pro.daily_basic(ts_code=ts_code, fields='pe,pb,ps,dv_ratio,total_mv', limit=1)
|
||||
if daily_basic.empty:
|
||||
return {"error": "无法获取股票估值数据"}
|
||||
|
||||
# 获取最新财务指标
|
||||
fina = pro.fina_indicator(ts_code=ts_code, fields='''roe,grossprofit_margin,netprofit_margin,
|
||||
netprofit_yoy,dt_netprofit_yoy,tr_yoy,or_yoy,assets_turn,inv_turn,ar_turn,current_ratio,
|
||||
quick_ratio,debt_to_assets,ocf_to_or,ocf_yoy,eps,bps,cfps,ocfps,retainedps''', limit=1)
|
||||
if fina.empty:
|
||||
return {"error": "无法获取财务指标数据"}
|
||||
|
||||
# 获取股票名称和当前价格
|
||||
basic_info = pro.daily(ts_code=ts_code, fields='close,trade_date', limit=1)
|
||||
stock_name = pro.stock_basic(ts_code=ts_code, fields='name').iloc[0]['name']
|
||||
|
||||
# 整合数据
|
||||
latest_daily = daily_basic.iloc[0]
|
||||
latest_fina = fina.iloc[0]
|
||||
latest_price = basic_info.iloc[0]
|
||||
|
||||
analysis_data = {
|
||||
"stock_info": {
|
||||
"code": stock_code,
|
||||
"name": stock_name,
|
||||
"current_price": float(latest_price['close']),
|
||||
"trade_date": str(latest_price['trade_date'])
|
||||
},
|
||||
"valuation": {
|
||||
"pe_ratio": float(latest_daily['pe']) if pd.notna(latest_daily['pe']) else None,
|
||||
"pb_ratio": float(latest_daily['pb']) if pd.notna(latest_daily['pb']) else None,
|
||||
"ps_ratio": float(latest_daily['ps']) if pd.notna(latest_daily['ps']) else None,
|
||||
"dividend_yield": float(latest_daily['dv_ratio'])/100 if pd.notna(latest_daily['dv_ratio']) else None,
|
||||
"total_market_value": float(latest_daily['total_mv'])/10000 if pd.notna(latest_daily['total_mv']) else None # 转换为亿元
|
||||
},
|
||||
"profitability": {
|
||||
"roe": float(latest_fina['roe'])/100 if pd.notna(latest_fina['roe']) else None,
|
||||
"gross_margin": float(latest_fina['grossprofit_margin'])/100 if pd.notna(latest_fina['grossprofit_margin']) else None,
|
||||
"net_margin": float(latest_fina['netprofit_margin'])/100 if pd.notna(latest_fina['netprofit_margin']) else None
|
||||
},
|
||||
"growth": {
|
||||
"net_profit_growth": float(latest_fina['netprofit_yoy'])/100 if pd.notna(latest_fina['netprofit_yoy']) else None,
|
||||
"deducted_net_profit_growth": float(latest_fina['dt_netprofit_yoy'])/100 if pd.notna(latest_fina['dt_netprofit_yoy']) else None,
|
||||
"revenue_growth": float(latest_fina['tr_yoy'])/100 if pd.notna(latest_fina['tr_yoy']) else None,
|
||||
"operating_revenue_growth": float(latest_fina['or_yoy'])/100 if pd.notna(latest_fina['or_yoy']) else None
|
||||
},
|
||||
"operation": {
|
||||
"asset_turnover": float(latest_fina['assets_turn']) if pd.notna(latest_fina['assets_turn']) else None,
|
||||
"inventory_turnover": float(latest_fina['inv_turn']) if pd.notna(latest_fina['inv_turn']) else None,
|
||||
"receivables_turnover": float(latest_fina['ar_turn']) if pd.notna(latest_fina['ar_turn']) else None
|
||||
},
|
||||
"solvency": {
|
||||
"current_ratio": float(latest_fina['current_ratio']) if pd.notna(latest_fina['current_ratio']) else None,
|
||||
"quick_ratio": float(latest_fina['quick_ratio']) if pd.notna(latest_fina['quick_ratio']) else None,
|
||||
"debt_to_assets": float(latest_fina['debt_to_assets'])/100 if pd.notna(latest_fina['debt_to_assets']) else None
|
||||
},
|
||||
"cash_flow": {
|
||||
"ocf_to_revenue": float(latest_fina['ocf_to_or'])/100 if pd.notna(latest_fina['ocf_to_or']) else None,
|
||||
"ocf_growth": float(latest_fina['ocf_yoy'])/100 if pd.notna(latest_fina['ocf_yoy']) else None
|
||||
},
|
||||
"per_share": {
|
||||
"eps": float(latest_fina['eps']) if pd.notna(latest_fina['eps']) else None,
|
||||
"bps": float(latest_fina['bps']) if pd.notna(latest_fina['bps']) else None,
|
||||
"cfps": float(latest_fina['cfps']) if pd.notna(latest_fina['cfps']) else None,
|
||||
"ocfps": float(latest_fina['ocfps']) if pd.notna(latest_fina['ocfps']) else None,
|
||||
"retained_eps": float(latest_fina['retainedps']) if pd.notna(latest_fina['retainedps']) else None
|
||||
}
|
||||
}
|
||||
|
||||
return analysis_data
|
||||
|
||||
except Exception as e:
|
||||
print(f"获取价值投资分析数据失败: {str(e)}")
|
||||
import traceback
|
||||
print(f"详细错误: {traceback.format_exc()}")
|
||||
return {"error": f"获取价值投资分析数据失败: {str(e)}"}
|
||||
603
app/services/stock_service_db.py
Normal file
@@ -0,0 +1,603 @@
|
||||
"""
|
||||
基于数据库的股票服务
|
||||
"""
|
||||
import pandas as pd
|
||||
from datetime import datetime, date
|
||||
from app import pro
|
||||
from app.dao import StockDAO, WatchlistDAO, ConfigDAO
|
||||
from app.config import Config
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class StockServiceDB:
|
||||
def __init__(self):
|
||||
self.stock_dao = StockDAO()
|
||||
self.watchlist_dao = WatchlistDAO()
|
||||
self.config_dao = ConfigDAO()
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
def get_stock_info(self, stock_code: str, force_refresh: bool = False):
|
||||
"""获取股票信息"""
|
||||
try:
|
||||
today = self.stock_dao.get_today_date()
|
||||
|
||||
# 检查缓存
|
||||
if not force_refresh:
|
||||
cached_data = self.stock_dao.get_stock_data(stock_code, today)
|
||||
if cached_data:
|
||||
self.logger.info(f"从数据库获取股票 {stock_code} 的数据")
|
||||
return self._format_stock_data(cached_data, self.watchlist_dao.get_watchlist_item(stock_code))
|
||||
|
||||
# 从API获取数据
|
||||
self.logger.info(f"从API获取股票 {stock_code} 的数据...")
|
||||
api_data = self._fetch_stock_from_api(stock_code)
|
||||
if 'error' in api_data:
|
||||
return api_data
|
||||
|
||||
# 保存到数据库
|
||||
stock_info = api_data['stock_info']
|
||||
success = self.stock_dao.save_stock_data(stock_code, stock_info, today)
|
||||
if not success:
|
||||
self.logger.warning(f"保存股票数据失败: {stock_code}")
|
||||
|
||||
# 获取目标值
|
||||
targets = self.watchlist_dao.get_watchlist_item(stock_code)
|
||||
if targets:
|
||||
api_data['targets'] = {
|
||||
"target_market_value": {
|
||||
"min": targets['target_market_value_min'],
|
||||
"max": targets['target_market_value_max']
|
||||
}
|
||||
}
|
||||
|
||||
return api_data
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取股票信息失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"获取股票数据失败: {str(e)}"}
|
||||
|
||||
def _fetch_stock_from_api(self, stock_code: str):
|
||||
"""从API获取股票数据"""
|
||||
try:
|
||||
# 验证股票代码格式
|
||||
if len(stock_code) != 6:
|
||||
return {"error": "股票代码格式错误"}
|
||||
|
||||
# 确定交易所
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
# 获取基本信息
|
||||
basic_info = pro.daily_basic(ts_code=ts_code, fields='ts_code,total_mv', limit=1)
|
||||
if basic_info.empty:
|
||||
return {"error": "股票代码不存在"}
|
||||
|
||||
# 获取股票名称
|
||||
stock_name = pro.stock_basic(ts_code=ts_code, fields='name').iloc[0]['name']
|
||||
|
||||
# 确保股票信息在数据库中
|
||||
market = 'SH' if stock_code.startswith('6') else 'SZ'
|
||||
self.stock_dao.add_or_update_stock(stock_code, stock_name, market)
|
||||
|
||||
# 获取最新财务指标
|
||||
fina_indicator = pro.fina_indicator(
|
||||
ts_code=ts_code,
|
||||
period=datetime.now().strftime('%Y%m%d'),
|
||||
fields='roe,grossprofit_margin,netprofit_margin,debt_to_assets,op_income_yoy,netprofit_yoy,bps,ocfps'
|
||||
)
|
||||
if fina_indicator.empty:
|
||||
fina_indicator = pro.fina_indicator(ts_code=ts_code, limit=1)
|
||||
|
||||
# 获取实时行情
|
||||
today = datetime.now().strftime('%Y%m%d')
|
||||
daily_data = pro.daily(ts_code=basic_info['ts_code'].iloc[0], start_date=today, end_date=today)
|
||||
if daily_data.empty:
|
||||
daily_data = pro.daily(ts_code=basic_info['ts_code'].iloc[0], limit=1)
|
||||
if daily_data.empty:
|
||||
return {"error": "无法获取股票行情数据"}
|
||||
|
||||
# 获取估值指标
|
||||
daily_basic = pro.daily_basic(
|
||||
ts_code=basic_info['ts_code'].iloc[0],
|
||||
fields='ts_code,trade_date,pe,pe_ttm,pb,ps,dv_ratio',
|
||||
limit=1
|
||||
)
|
||||
|
||||
if daily_basic.empty:
|
||||
return {"error": "无法获取股票基础数据"}
|
||||
|
||||
latest_basic = daily_basic.iloc[0]
|
||||
latest_fina = fina_indicator.iloc[0] if not fina_indicator.empty else pd.Series()
|
||||
|
||||
# 计算市值
|
||||
current_price = float(daily_data['close'].iloc[0])
|
||||
market_value = float(basic_info['total_mv'].iloc[0]) / 10000
|
||||
|
||||
# 处理各种指标
|
||||
dv_ratio = float(latest_basic['dv_ratio']) if pd.notna(latest_basic['dv_ratio']) else 0
|
||||
dividend_yield = round(dv_ratio / 100, 4)
|
||||
|
||||
stock_info = {
|
||||
"code": stock_code,
|
||||
"name": stock_name,
|
||||
"market_value": round(market_value, 2),
|
||||
"pe_ratio": round(float(latest_basic['pe']), 2) if pd.notna(latest_basic['pe']) else 0,
|
||||
"pe_ttm": round(float(latest_basic['pe_ttm']), 2) if pd.notna(latest_basic.get('pe_ttm')) else 0,
|
||||
"pb_ratio": round(float(latest_basic['pb']), 2) if pd.notna(latest_basic['pb']) else 0,
|
||||
"ps_ratio": round(float(latest_basic['ps']), 2) if pd.notna(latest_basic['ps']) else 0,
|
||||
"dividend_yield": dividend_yield,
|
||||
"price": round(current_price, 2),
|
||||
"change_percent": round(float(daily_data['pct_chg'].iloc[0]) / 100, 4),
|
||||
# 财务指标
|
||||
"roe": round(float(latest_fina['roe']) / 100, 4) if pd.notna(latest_fina.get('roe')) else 0,
|
||||
"gross_profit_margin": round(float(latest_fina['grossprofit_margin']) / 100, 4) if pd.notna(latest_fina.get('grossprofit_margin')) else 0,
|
||||
"net_profit_margin": round(float(latest_fina['netprofit_margin']) / 100, 4) if pd.notna(latest_fina.get('netprofit_margin')) else 0,
|
||||
"debt_to_assets": round(float(latest_fina['debt_to_assets']) / 100, 4) if pd.notna(latest_fina.get('debt_to_assets')) else 0,
|
||||
"revenue_yoy": round(float(latest_fina['op_income_yoy']) / 100, 4) if pd.notna(latest_fina.get('op_income_yoy')) else 0,
|
||||
"net_profit_yoy": round(float(latest_fina['netprofit_yoy']) / 100, 4) if pd.notna(latest_fina.get('netprofit_yoy')) else 0,
|
||||
"bps": round(float(latest_fina['bps']), 3) if pd.notna(latest_fina.get('bps')) else 0,
|
||||
"ocfps": round(float(latest_fina['ocfps']), 3) if pd.notna(latest_fina.get('ocfps')) else 0,
|
||||
"from_cache": False
|
||||
}
|
||||
|
||||
return {"stock_info": stock_info, "targets": {}}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"从API获取股票数据失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"获取股票数据失败: {str(e)}"}
|
||||
|
||||
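上文(以及后面的公司详情、股东、估值等方法)多次重复"6 开头为上交所、0/3 开头为深交所"的代码转换,若要消除重复,可以抽出一个纯函数。以下仅是重构方向的示意,并非仓库中已有的工具函数:

```python
# 示意:把 6 位股票代码转换为 tushare 的 ts_code(逻辑与上文一致;假设性的辅助函数)
from typing import Optional

def to_ts_code(stock_code: str) -> Optional[str]:
    if len(stock_code) != 6 or not stock_code.isdigit():
        return None
    if stock_code.startswith('6'):
        return f"{stock_code}.SH"
    if stock_code.startswith(('0', '3')):
        return f"{stock_code}.SZ"
    return None   # 其他前缀与上文一致,按"不支持"处理

assert to_ts_code('600519') == '600519.SH'
assert to_ts_code('300750') == '300750.SZ'
assert to_ts_code('830001') is None
```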
def _format_stock_data(self, stock_data: dict, watchlist_item: dict = None):
|
||||
"""格式化股票数据为API返回格式"""
|
||||
stock_info = {
|
||||
"code": stock_data['stock_code'],
|
||||
"name": stock_data['stock_name'],
|
||||
"market_value": stock_data['market_value'],
|
||||
"pe_ratio": stock_data['pe_ratio'],
|
||||
"pb_ratio": stock_data['pb_ratio'],
|
||||
"ps_ratio": stock_data['ps_ratio'],
|
||||
"dividend_yield": stock_data['dividend_yield'],
|
||||
"price": stock_data['price'],
|
||||
"change_percent": stock_data['change_percent'],
|
||||
"roe": stock_data['roe'],
|
||||
"gross_profit_margin": stock_data['gross_profit_margin'],
|
||||
"net_profit_margin": stock_data['net_profit_margin'],
|
||||
"debt_to_assets": stock_data['debt_to_assets'],
|
||||
"revenue_yoy": stock_data['revenue_yoy'],
|
||||
"net_profit_yoy": stock_data['net_profit_yoy'],
|
||||
"bps": stock_data['bps'],
|
||||
"ocfps": stock_data['ocfps'],
|
||||
"from_cache": stock_data['from_cache']
|
||||
}
|
||||
|
||||
targets = {}
|
||||
if watchlist_item:
|
||||
targets = {
|
||||
"target_market_value": {
|
||||
"min": watchlist_item['target_market_value_min'],
|
||||
"max": watchlist_item['target_market_value_max']
|
||||
}
|
||||
}
|
||||
|
||||
return {"stock_info": stock_info, "targets": targets}
|
||||
|
||||
def get_watchlist(self):
|
||||
"""获取监控列表"""
|
||||
try:
|
||||
watchlist_data = self.watchlist_dao.get_watchlist_with_data()
|
||||
result = []
|
||||
|
||||
for item in watchlist_data:
|
||||
# 处理股票信息
|
||||
stock_info = {
|
||||
"code": item['stock_code'],
|
||||
"name": item['stock_name']
|
||||
}
|
||||
|
||||
# 如果有股票数据,添加更多信息
|
||||
if item.get('price') is not None:
|
||||
stock_info.update({
|
||||
"price": float(item['price']) if item['price'] else None,
|
||||
"change_percent": float(item['change_percent']) if item['change_percent'] else None,
|
||||
"market_value": float(item['current_market_value']) if item['current_market_value'] else None,
|
||||
"pe_ratio": float(item['pe_ratio']) if item['pe_ratio'] else None,
|
||||
"pb_ratio": float(item['pb_ratio']) if item['pb_ratio'] else None,
|
||||
"from_cache": bool(item.get('from_cache', False))
|
||||
})
|
||||
|
||||
# 处理目标市值
|
||||
targets = {}
|
||||
if item.get('target_market_value_min') is not None or item.get('target_market_value_max') is not None:
|
||||
targets["target_market_value"] = {
|
||||
"min": item.get('target_market_value_min'),
|
||||
"max": item.get('target_market_value_max')
|
||||
}
|
||||
|
||||
result.append({
|
||||
"stock_info": stock_info,
|
||||
"targets": targets
|
||||
})
|
||||
|
||||
return result
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取监控列表失败: {e}")
|
||||
return []
|
||||
|
||||
def add_watch(self, stock_code: str, target_market_value_min: float = None, target_market_value_max: float = None):
|
||||
"""添加股票到监控列表"""
|
||||
try:
|
||||
success = self.watchlist_dao.add_to_watchlist(
|
||||
stock_code, target_market_value_min, target_market_value_max
|
||||
)
|
||||
return {"status": "success" if success else "failed"}
|
||||
except Exception as e:
|
||||
self.logger.error(f"添加监控股票失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"添加监控股票失败: {str(e)}"}
|
||||
|
||||
def remove_watch(self, stock_code: str):
|
||||
"""从监控列表移除股票"""
|
||||
try:
|
||||
success = self.watchlist_dao.remove_from_watchlist(stock_code)
|
||||
return {"status": "success" if success else "failed"}
|
||||
except Exception as e:
|
||||
self.logger.error(f"移除监控股票失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"移除监控股票失败: {str(e)}"}
|
||||
|
||||
def update_target(self, stock_code: str, target_market_value_min: float = None, target_market_value_max: float = None):
|
||||
"""更新股票的目标市值"""
|
||||
try:
|
||||
success = self.watchlist_dao.update_watchlist_item(
|
||||
stock_code, target_market_value_min, target_market_value_max
|
||||
)
|
||||
return {"status": "success" if success else "failed"}
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新目标市值失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"更新目标市值失败: {str(e)}"}
|
||||
|
||||
def get_index_info(self):
|
||||
"""获取主要指数数据(此功能保持不变,不需要数据库存储)"""
|
||||
try:
|
||||
index_codes = {
|
||||
'000001.SH': '上证指数',
|
||||
'399001.SZ': '深证成指',
|
||||
'399006.SZ': '创业板指',
|
||||
'000016.SH': '上证50',
|
||||
'000300.SH': '沪深300',
|
||||
'000905.SH': '中证500',
|
||||
'000852.SH': '中证1000',
|
||||
'899050.BJ': '北证50',
|
||||
}
|
||||
|
||||
result = []
|
||||
for ts_code, name in index_codes.items():
|
||||
try:
|
||||
df = pro.index_daily(ts_code=ts_code, limit=1)
|
||||
if not df.empty:
|
||||
data = df.iloc[0]
|
||||
# 获取K线数据(最近20天)
|
||||
kline_df = pro.index_daily(ts_code=ts_code, limit=20)
|
||||
kline_data = []
|
||||
if not kline_df.empty:
|
||||
for _, row in kline_df.iterrows():
|
||||
kline_data.append({
|
||||
'date': row['trade_date'],
|
||||
'open': float(row['open']),
|
||||
'close': float(row['close']),
|
||||
'high': float(row['high']),
|
||||
'low': float(row['low']),
|
||||
'vol': float(row['vol'])
|
||||
})
|
||||
|
||||
result.append({
|
||||
'code': ts_code,
|
||||
'name': name,
|
||||
'price': float(data['close']),
|
||||
'change': float(data['pct_chg']),
|
||||
'kline_data': kline_data
|
||||
})
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取指数 {ts_code} 数据失败: {str(e)}")
|
||||
continue
|
||||
|
||||
return result
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取指数数据失败: {str(e)}")
|
||||
return []
|
||||
|
||||
def batch_update_watchlist_data(self):
|
||||
"""批量更新监控列表的股票数据"""
|
||||
try:
|
||||
# 获取需要更新的股票
|
||||
stocks_to_update = self.watchlist_dao.get_stocks_needing_update()
|
||||
updated_count = 0
|
||||
failed_count = 0
|
||||
|
||||
for stock_code in stocks_to_update:
|
||||
try:
|
||||
result = self.get_stock_info(stock_code, force_refresh=True)
|
||||
if 'error' not in result:
|
||||
updated_count += 1
|
||||
else:
|
||||
failed_count += 1
|
||||
except Exception as e:
|
||||
self.logger.error(f"更新股票数据失败: {stock_code}, 错误: {e}")
|
||||
failed_count += 1
|
||||
|
||||
# 更新最后更新日期
|
||||
self.config_dao.set_last_data_update_date(self.stock_dao.get_today_date())
|
||||
|
||||
return {
|
||||
"total": len(stocks_to_update),
|
||||
"updated": updated_count,
|
||||
"failed": failed_count
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"批量更新监控列表数据失败: {e}")
|
||||
return {"error": f"批量更新失败: {str(e)}"}
|
||||
|
||||
# 保持原有的其他方法不变,这些方法不需要数据库存储
|
||||
def get_company_detail(self, stock_code: str):
|
||||
"""获取公司详情(从API获取,实时数据)"""
|
||||
try:
|
||||
# 处理股票代码格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
ts_code = f"{stock_code}.SZ"
|
||||
else:
|
||||
return {"error": "不支持的股票代码"}
|
||||
|
||||
# 获取公司基本信息
|
||||
basic = pro.stock_basic(ts_code=ts_code, fields='name,industry,area,list_date')
|
||||
if basic.empty:
|
||||
return {"error": "无法获取公司信息"}
|
||||
|
||||
company_info = basic.iloc[0]
|
||||
|
||||
# 获取公司详细信息
|
||||
try:
|
||||
company_detail = pro.stock_company(ts_code=ts_code)
|
||||
if not company_detail.empty:
|
||||
detail_info = company_detail.iloc[0]
|
||||
company_detail_dict = {
|
||||
"com_name": str(detail_info.get('com_name', '')),
|
||||
"chairman": str(detail_info.get('chairman', '')),
|
||||
"manager": str(detail_info.get('manager', '')),
|
||||
"secretary": str(detail_info.get('secretary', '')),
|
||||
"reg_capital": float(detail_info.get('reg_capital', 0)) if pd.notna(detail_info.get('reg_capital')) else 0,
|
||||
"setup_date": str(detail_info.get('setup_date', '')),
|
||||
"province": str(detail_info.get('province', '')),
|
||||
"city": str(detail_info.get('city', '')),
|
||||
"introduction": str(detail_info.get('introduction', '')),
|
||||
"website": f"http://{str(detail_info.get('website', '')).strip('http://').strip('https://')}" if detail_info.get('website') else "",
|
||||
"email": str(detail_info.get('email', '')),
|
||||
"office": str(detail_info.get('office', '')),
|
||||
"employees": int(detail_info.get('employees', 0)) if pd.notna(detail_info.get('employees')) else 0,
|
||||
"main_business": str(detail_info.get('main_business', '')),
|
||||
"business_scope": str(detail_info.get('business_scope', ''))
|
||||
}
|
||||
else:
|
||||
company_detail_dict = {
|
||||
"com_name": "", "chairman": "", "manager": "", "secretary": "",
|
||||
"reg_capital": 0, "setup_date": "", "province": "", "city": "",
|
||||
"introduction": "", "website": "", "email": "", "office": "",
|
||||
"employees": 0, "main_business": "", "business_scope": ""
|
||||
}
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取公司详细信息失败: {str(e)}")
|
||||
company_detail_dict = {
|
||||
"com_name": "", "chairman": "", "manager": "", "secretary": "",
|
||||
"reg_capital": 0, "setup_date": "", "province": "", "city": "",
|
||||
"introduction": "", "website": "", "email": "", "office": "",
|
||||
"employees": 0, "main_business": "", "business_scope": ""
|
||||
}
|
||||
|
||||
# 获取最新财务指标
|
||||
try:
|
||||
fina = pro.fina_indicator(ts_code=ts_code, period=datetime.now().strftime('%Y%m%d'))
|
||||
if fina.empty:
|
||||
fina = pro.fina_indicator(ts_code=ts_code, limit=1)
|
||||
|
||||
if fina.empty:
|
||||
return {"error": "无法获取财务数据"}
|
||||
|
||||
fina_info = fina.iloc[0]
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取财务指标失败: {str(e)}")
|
||||
return {"error": "获取财务指标失败"}
|
||||
|
||||
# 获取市值信息(用于PE、PB等指标)
|
||||
try:
|
||||
daily_basic = pro.daily_basic(ts_code=ts_code, fields='pe,pb,ps,dv_ratio', limit=1)
|
||||
if not daily_basic.empty:
|
||||
latest_basic = daily_basic.iloc[0]
|
||||
else:
|
||||
latest_basic = pd.Series({'pe': 0, 'pb': 0, 'ps': 0, 'dv_ratio': 0})
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取PE/PB失败: {str(e)}")
|
||||
latest_basic = pd.Series({'pe': 0, 'pb': 0, 'ps': 0, 'dv_ratio': 0})
|
||||
|
||||
result = {
|
||||
"basic_info": {
|
||||
"name": str(company_info['name']),
|
||||
"industry": str(company_info['industry']),
|
||||
"list_date": str(company_info['list_date']),
|
||||
"area": str(company_info['area']),
|
||||
**company_detail_dict
|
||||
},
|
||||
"financial_info": {
|
||||
# 估值指标
|
||||
"pe_ratio": float(latest_basic['pe']) if pd.notna(latest_basic['pe']) else 0,
|
||||
"pb_ratio": float(latest_basic['pb']) if pd.notna(latest_basic['pb']) else 0,
|
||||
"ps_ratio": float(latest_basic['ps']) if pd.notna(latest_basic['ps']) else 0,
|
||||
"dividend_yield": float(latest_basic['dv_ratio'])/100 if pd.notna(latest_basic['dv_ratio']) else 0,
|
||||
|
||||
# 盈利能力
|
||||
"roe": float(fina_info['roe']) if pd.notna(fina_info.get('roe')) else 0,
|
||||
"grossprofit_margin": float(fina_info['grossprofit_margin']) if pd.notna(fina_info.get('grossprofit_margin')) else 0,
|
||||
"netprofit_margin": float(fina_info['netprofit_margin']) if pd.notna(fina_info.get('netprofit_margin')) else 0,
|
||||
|
||||
# 成长能力
|
||||
"netprofit_yoy": float(fina_info['netprofit_yoy']) if pd.notna(fina_info.get('netprofit_yoy')) else 0,
|
||||
"or_yoy": float(fina_info['or_yoy']) if pd.notna(fina_info.get('or_yoy')) else 0,
|
||||
|
||||
# 偿债能力
|
||||
"debt_to_assets": float(fina_info['debt_to_assets']) if pd.notna(fina_info.get('debt_to_assets')) else 0,
|
||||
|
||||
# 每股指标
|
||||
"eps": float(fina_info['eps']) if pd.notna(fina_info.get('eps')) else 0,
|
||||
"bps": float(fina_info['bps']) if pd.notna(fina_info.get('bps')) else 0,
|
||||
"ocfps": float(fina_info['ocfps']) if pd.notna(fina_info.get('ocfps')) else 0,
|
||||
}
|
||||
}
|
||||
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"获取公司详情失败: {stock_code}, 错误: {e}")
|
||||
return {"error": f"获取公司详情失败: {str(e)}"}
|
||||
|
||||
def get_top_holders(self, stock_code: str):
|
||||
"""获取前十大股东数据(从API获取,实时数据)"""
|
||||
try:
|
||||
# 处理股票代码格式
|
||||
if stock_code.startswith('6'):
|
||||
ts_code = f"{stock_code}.SH"
|
||||
elif stock_code.startswith(('0', '3')):
|
||||
                ts_code = f"{stock_code}.SZ"
            else:
                return {"error": "不支持的股票代码"}

            # 获取最新一期的股东数据
            df = pro.top10_holders(ts_code=ts_code, limit=10)
            if df.empty:
                return {"error": "暂无股东数据"}

            # 按持股比例降序排序
            df = df.sort_values('hold_ratio', ascending=False)

            # 获取最新的报告期
            latest_end_date = df['end_date'].max()
            latest_data = df[df['end_date'] == latest_end_date]

            holders = []
            for _, row in latest_data.iterrows():
                holders.append({
                    "holder_name": str(row['holder_name']),
                    "hold_amount": float(row['hold_amount']) if pd.notna(row['hold_amount']) else 0,
                    "hold_ratio": float(row['hold_ratio']) if pd.notna(row['hold_ratio']) else 0,
                    "hold_change": float(row['hold_change']) if pd.notna(row['hold_change']) else 0,
                    "ann_date": str(row['ann_date']),
                    "end_date": str(row['end_date'])
                })

            result = {
                "holders": holders,
                "total_ratio": sum(holder['hold_ratio'] for holder in holders),
                "report_date": str(latest_end_date)
            }

            return result

        except Exception as e:
            self.logger.error(f"获取股东数据失败: {stock_code}, 错误: {e}")
            return {"error": f"获取股东数据失败: {str(e)}"}

    def get_value_analysis_data(self, stock_code: str):
        """获取价值投资分析数据(优先从数据库,如果没有则从API获取)"""
        try:
            # 先尝试从数据库获取今日数据
            today = self.stock_dao.get_today_date()
            cached_data = self.stock_dao.get_stock_data(stock_code, today)

            if cached_data and not cached_data['from_cache']:
                # 如果有今日的API数据(非缓存),直接使用
                return self._format_value_analysis_data(cached_data)

            # 否则从API获取
            api_result = self.get_stock_info(stock_code, force_refresh=True)
            if 'error' in api_result:
                return api_result

            stock_info = api_result['stock_info']
            return self._format_value_analysis_data_from_info(stock_info)

        except Exception as e:
            self.logger.error(f"获取价值投资分析数据失败: {stock_code}, 错误: {e}")
            return {"error": f"获取价值投资分析数据失败: {str(e)}"}

    def _format_value_analysis_data(self, stock_data: dict):
        """格式化价值投资分析数据"""
        return {
            "stock_info": {
                "code": stock_data['stock_code'],
                "name": stock_data['stock_name'],
                "current_price": stock_data['price'],
                "trade_date": stock_data.get('data_date', self.stock_dao.get_today_date())
            },
            "valuation": {
                "pe_ratio": stock_data['pe_ratio'],
                "pb_ratio": stock_data['pb_ratio'],
                "ps_ratio": stock_data['ps_ratio'],
                "dividend_yield": stock_data['dividend_yield'],
                "total_market_value": stock_data['market_value']
            },
            "profitability": {
                "roe": stock_data['roe'],
                "gross_margin": stock_data['gross_profit_margin'],
                "net_margin": stock_data['net_profit_margin']
            },
            "growth": {
                "net_profit_growth": stock_data['net_profit_yoy'],
                "revenue_growth": stock_data['revenue_yoy']
            },
            "solvency": {
                "debt_to_assets": stock_data['debt_to_assets']
            },
            "per_share": {
                "eps": stock_data.get('eps', 0),  # 这个字段在基础数据中没有,需要计算
                "bps": stock_data['bps'],
                "ocfps": stock_data['ocfps']
            }
        }

    def _format_value_analysis_data_from_info(self, stock_info: dict):
        """从股票信息格式化价值投资分析数据"""
        return {
            "stock_info": {
                "code": stock_info['code'],
                "name": stock_info['name'],
                "current_price": stock_info['price'],
                "trade_date": self.stock_dao.get_today_date()
            },
            "valuation": {
                "pe_ratio": stock_info['pe_ratio'],
                "pb_ratio": stock_info['pb_ratio'],
                "ps_ratio": stock_info['ps_ratio'],
                "dividend_yield": stock_info['dividend_yield'],
                "total_market_value": stock_info['market_value']
            },
            "profitability": {
                "roe": stock_info['roe'],
                "gross_margin": stock_info['gross_profit_margin'],
                "net_margin": stock_info['net_profit_margin']
            },
            "growth": {
                "net_profit_growth": stock_info['net_profit_yoy'],
                "revenue_growth": stock_info['revenue_yoy']
            },
            "solvency": {
                "debt_to_assets": stock_info['debt_to_assets']
            },
            "per_share": {
                "eps": stock_info.get('eps', 0),
                "bps": stock_info['bps'],
                "ocfps": stock_info['ocfps']
            }
        }
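For orientation, here is a minimal sketch of how the value-analysis entry point above might be called from other code. The no-argument constructor and the sample stock code are assumptions for illustration, not taken from this commit.

```python
from app.services.stock_service_db import StockServiceDB

service = StockServiceDB()  # assumed: real constructor may require config/DAO arguments
result = service.get_value_analysis_data("600519")  # hypothetical stock code

if "error" in result:
    print(result["error"])
else:
    # The formatter groups metrics by valuation / profitability / growth / solvency / per_share
    info = result["stock_info"]
    valuation = result["valuation"]
    print(info["name"], valuation["pe_ratio"], valuation["pb_ratio"])
```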
678
app/templates/stocks_simple.html
Normal file
@@ -0,0 +1,678 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="zh-CN">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>股票市场 - 股票监控系统</title>
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.7.2/font/bootstrap-icons.css" rel="stylesheet">
|
||||
<style>
|
||||
.positive { color: #e74c3c; }
|
||||
.negative { color: #27ae60; }
|
||||
.chart-container {
|
||||
height: 400px;
|
||||
margin: 20px 0;
|
||||
}
|
||||
.filter-section {
|
||||
background-color: #f8f9fa;
|
||||
padding: 20px;
|
||||
border-radius: 8px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
.loading {
|
||||
text-align: center;
|
||||
padding: 50px;
|
||||
}
|
||||
.sector-tag {
|
||||
background-color: #e3f2fd;
|
||||
color: #1976d2;
|
||||
padding: 2px 8px;
|
||||
border-radius: 12px;
|
||||
font-size: 12px;
|
||||
margin: 2px;
|
||||
}
|
||||
.spinning {
|
||||
animation: spin 1s linear infinite;
|
||||
}
|
||||
@keyframes spin {
|
||||
0% { transform: rotate(0deg); }
|
||||
100% { transform: rotate(360deg); }
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div id="app">
|
||||
<!-- 导航栏 -->
|
||||
<nav class="navbar navbar-expand-lg navbar-dark bg-primary">
|
||||
<div class="container">
|
||||
<a class="navbar-brand" href="/">
|
||||
<i class="bi bi-graph-up"></i>
|
||||
股票监控系统
|
||||
</a>
|
||||
<div class="navbar-nav">
|
||||
<a class="nav-link active" href="/stocks">股票市场</a>
|
||||
<a class="nav-link" href="/">我的监控</a>
|
||||
<a class="nav-link" href="/market">指数行情</a>
|
||||
</div>
|
||||
</div>
|
||||
</nav>
|
||||
|
||||
<div class="container mt-4">
|
||||
<!-- 市场概览 -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-12">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">
|
||||
<i class="bi bi-bar-chart"></i>
|
||||
市场概览
|
||||
<button class="btn btn-sm btn-outline-primary float-end" @click="refreshOverview">
|
||||
<i class="bi bi-arrow-clockwise"></i>
|
||||
刷新
|
||||
</button>
|
||||
</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="row text-center" v-if="overviewLoading">
|
||||
<div class="col-12 loading">
|
||||
<div class="spinner-border" role="status">
|
||||
<span class="visually-hidden">加载中...</span>
|
||||
</div>
|
||||
<p>加载市场数据中...</p>
|
||||
</div>
|
||||
</div>
|
||||
<div class="row" v-else-if="overview.statistics">
|
||||
<div class="col-md-2">
|
||||
<h6>总股票数</h6>
|
||||
<h4>[[ overview.statistics.total_count ]]</h4>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<h6>上涨</h6>
|
||||
<h4 class="positive">[[ overview.statistics.up_count ]]</h4>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<h6>下跌</h6>
|
||||
<h4 class="negative">[[ overview.statistics.down_count ]]</h4>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<h6>成交量</h6>
|
||||
<h4>[[ formatVolume(overview.statistics.total_volume) ]]</h4>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<h6>成交额</h6>
|
||||
<h4>[[ formatAmount(overview.statistics.total_amount) ]]</h4>
|
||||
</div>
|
||||
</div>
|
||||
<div v-else class="text-center py-3">
|
||||
<p class="text-muted">暂无市场数据</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- 筛选和搜索 -->
|
||||
<div class="filter-section">
|
||||
<div class="row">
|
||||
<div class="col-md-3">
|
||||
<label class="form-label">搜索股票</label>
|
||||
<input type="text" class="form-control" v-model="searchKeyword"
|
||||
placeholder="股票代码或名称" @input="searchStocks">
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<label class="form-label">行业筛选</label>
|
||||
<select class="form-select" v-model="selectedIndustry" @change="filterStocks">
|
||||
<option value="">全部行业</option>
|
||||
<option v-for="industry in industries" :value="industry.industry_code">
|
||||
[[ industry.industry_name ]] ([[ industry.stock_count ]])
|
||||
</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<label class="form-label">概念筛选</label>
|
||||
<select class="form-select" v-model="selectedSector" @change="filterStocks">
|
||||
<option value="">全部概念</option>
|
||||
<option v-for="sector in sectors" :value="sector.sector_code">
|
||||
[[ sector.sector_name ]] ([[ sector.stock_count ]])
|
||||
</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<label class="form-label">热门排行</label>
|
||||
<select class="form-select" @change="loadHotStocks($event.target.value)">
|
||||
<option value="">选择排行榜</option>
|
||||
<option value="volume">成交量排行榜</option>
|
||||
<option value="amount">成交额排行榜</option>
|
||||
<option value="change">涨幅排行榜</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<label class="form-label">数据同步</label>
|
||||
<div>
|
||||
<button class="btn btn-primary btn-sm me-2" @click="syncData" :disabled="syncing">
|
||||
<i class="bi bi-arrow-repeat" v-if="!syncing"></i>
|
||||
<i class="bi bi-arrow-repeat spinning" v-if="syncing"></i>
|
||||
同步数据
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- 股票列表 -->
|
||||
<div class="row">
|
||||
<div class="col-md-12">
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<h5 class="mb-0">
|
||||
<i class="bi bi-list"></i>
|
||||
股票列表
|
||||
<span class="badge bg-secondary ms-2">[[ pagination.total ]] 只</span>
|
||||
</h5>
|
||||
<div>
|
||||
<span class="text-muted">
|
||||
显示第 [[ (pagination.page - 1) * pagination.size + 1 ]] -
|
||||
[[ Math.min(pagination.page * pagination.size, pagination.total) ]] 条
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div v-if="loading" class="loading">
|
||||
<div class="spinner-border" role="status">
|
||||
<span class="visually-hidden">加载中...</span>
|
||||
</div>
|
||||
<p>加载股票数据中...</p>
|
||||
</div>
|
||||
|
||||
<div v-else-if="stocks.length === 0" class="text-center py-5">
|
||||
<i class="bi bi-inbox display-1 text-muted"></i>
|
||||
<p class="text-muted">暂无股票数据,请先同步数据</p>
|
||||
</div>
|
||||
|
||||
<div v-else>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>股票代码</th>
|
||||
<th>股票名称</th>
|
||||
<th>所属行业</th>
|
||||
<th>概念板块</th>
|
||||
<th>最新价</th>
|
||||
<th>涨跌幅</th>
|
||||
<th>成交量</th>
|
||||
<th>操作</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
<tr v-for="stock in stocks" :key="stock.stock_code"
|
||||
@click="showStockDetail(stock.stock_code)"
|
||||
class="stock-card">
|
||||
<td>
|
||||
<strong>[[ stock.stock_code ]]</strong>
|
||||
<small class="text-muted d-block">[[ stock.market ]]</small>
|
||||
</td>
|
||||
<td>[[ stock.stock_name ]]</td>
|
||||
<td>
|
||||
<small class="badge bg-light text-dark">
|
||||
[[ stock.industry_name || '未分类' ]]
|
||||
</small>
|
||||
</td>
|
||||
<td>
|
||||
<span v-for="sector in getSectorNames(stock.sector_names)"
|
||||
:key="sector" class="sector-tag">
|
||||
[[ sector ]]
|
||||
</span>
|
||||
</td>
|
||||
<td v-if="stock.price">
|
||||
<strong>[[ stock.price.toFixed(2) ]]</strong>
|
||||
</td>
|
||||
<td v-else>-</td>
|
||||
<td v-if="stock.change_percent !== null">
|
||||
<span :class="stock.change_percent >= 0 ? 'positive' : 'negative'">
|
||||
[[ (stock.change_percent * 100).toFixed(2) ]]%
|
||||
</span>
|
||||
</td>
|
||||
<td v-else>-</td>
|
||||
<td v-if="stock.volume">
|
||||
[[ formatVolume(stock.volume) ]]
|
||||
</td>
|
||||
<td v-else>-</td>
|
||||
<td>
|
||||
<button class="btn btn-sm btn-outline-primary me-1"
|
||||
@click.stop="showStockDetail(stock.stock_code)">
|
||||
<i class="bi bi-eye"></i>
|
||||
</button>
|
||||
<button class="btn btn-sm btn-outline-success"
|
||||
@click.stop="addToWatchlist(stock.stock_code)">
|
||||
<i class="bi bi-plus"></i>
|
||||
</button>
|
||||
</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- 分页 -->
|
||||
<nav aria-label="股票列表分页" class="mt-3" v-if="pagination.pages > 1">
|
||||
<ul class="pagination justify-content-center">
|
||||
<li class="page-item" :class="{ disabled: pagination.page <= 1 }">
|
||||
<a class="page-link" href="#" @click.prevent="changePage(pagination.page - 1)">
|
||||
上一页
|
||||
</a>
|
||||
</li>
|
||||
<li v-for="page in pagination.pages" :key="page"
|
||||
class="page-item" :class="{ active: page === pagination.page }">
|
||||
<a class="page-link" href="#" @click.prevent="changePage(page)">
|
||||
[[ page ]]
|
||||
</a>
|
||||
</li>
|
||||
<li class="page-item" :class="{ disabled: pagination.page >= pagination.pages }">
|
||||
<a class="page-link" href="#" @click.prevent="changePage(pagination.page + 1)">
|
||||
下一页
|
||||
</a>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- 股票详情模态框 -->
|
||||
<div class="modal fade" id="stockModal" tabindex="-1">
|
||||
<div class="modal-dialog modal-lg">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title" v-if="selectedStock">
|
||||
[[ selectedStock.stock_name ]] ([[ selectedStock.stock_code ]])
|
||||
</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div v-if="selectedStock" class="row">
|
||||
<div class="col-md-12">
|
||||
<div class="chart-container">
|
||||
<div ref="klineChart"></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div v-if="stockDetailLoading" class="text-center">
|
||||
<div class="spinner-border" role="status">
|
||||
<span class="visually-hidden">加载中...</span>
|
||||
</div>
|
||||
<p>加载股票详情中...</p>
|
||||
</div>
|
||||
<div v-else-if="stockDetail && stockDetail.stock_info">
|
||||
<div class="row">
|
||||
<div class="col-md-6">
|
||||
<h6>基本信息</h6>
|
||||
<table class="table table-sm">
|
||||
<tr>
|
||||
<td>股票代码:</td>
|
||||
<td>[[ stockDetail.stock_info.code ]]</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>股票名称:</td>
|
||||
<td>[[ stockDetail.stock_info.name ]]</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>当前价格:</td>
|
||||
<td>[[ stockDetail.stock_info.price ]]元</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>市盈率:</td>
|
||||
<td>[[ stockDetail.stock_info.pe_ratio ]]</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
<div class="col-md-6">
|
||||
<h6>估值指标</h6>
|
||||
<table class="table table-sm">
|
||||
<tr>
|
||||
<td>市净率:</td>
|
||||
<td>[[ stockDetail.stock_info.pb_ratio ]]</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>市销率:</td>
|
||||
<td>[[ stockDetail.stock_info.ps_ratio ]]</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>股息率:</td>
|
||||
<td>[[ (stockDetail.stock_info.dividend_yield * 100).toFixed(2) ]]%</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>总市值:</td>
|
||||
<td>[[ stockDetail.stock_info.market_value ]]亿元</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">关闭</button>
|
||||
<button type="button" class="btn btn-primary"
|
||||
v-if="selectedStock" @click="addToWatchlist(selectedStock.stock_code)">
|
||||
<i class="bi bi-plus"></i>
|
||||
添加到监控
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js"></script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/vue@3.2.31/dist/vue.global.js"></script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/axios@0.27.2/dist/axios.min.js"></script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/echarts@5.4.0/dist/echarts.min.js"></script>
|
||||
<script>
|
||||
const { createApp } = Vue;
|
||||
|
||||
createApp({
|
||||
delimiters: ['[[', ']]'],
|
||||
data() {
|
||||
return {
|
||||
stocks: [],
|
||||
industries: [],
|
||||
sectors: [],
|
||||
overview: {},
|
||||
loading: false,
|
||||
overviewLoading: false,
|
||||
syncing: false,
|
||||
stockDetailLoading: false,
|
||||
selectedStock: null,
|
||||
stockDetail: null,
|
||||
searchKeyword: '',
|
||||
selectedIndustry: '',
|
||||
selectedSector: '',
|
||||
pagination: {
|
||||
page: 1,
|
||||
size: 20,
|
||||
total: 0,
|
||||
pages: 1
|
||||
},
|
||||
searchTimer: null,
|
||||
klineChart: null
|
||||
};
|
||||
},
|
||||
mounted() {
|
||||
this.loadStocks();
|
||||
this.loadIndustries();
|
||||
this.loadSectors();
|
||||
this.loadOverview();
|
||||
},
|
||||
methods: {
|
||||
async loadStocks() {
|
||||
this.loading = true;
|
||||
try {
|
||||
const params = {
|
||||
page: this.pagination.page,
|
||||
size: this.pagination.size
|
||||
};
|
||||
|
||||
if (this.searchKeyword) params.search = this.searchKeyword;
|
||||
if (this.selectedIndustry) params.industry = this.selectedIndustry;
|
||||
if (this.selectedSector) params.sector = this.selectedSector;
|
||||
|
||||
const response = await axios.get('/api/market/stocks', { params });
|
||||
this.stocks = response.data.data;
|
||||
this.pagination.total = response.data.total;
|
||||
this.pagination.pages = response.data.pages;
|
||||
} catch (error) {
|
||||
console.error('加载股票列表失败:', error);
|
||||
} finally {
|
||||
this.loading = false;
|
||||
}
|
||||
},
|
||||
|
||||
async loadIndustries() {
|
||||
try {
|
||||
const response = await axios.get('/api/market/industries');
|
||||
this.industries = response.data.data;
|
||||
} catch (error) {
|
||||
console.error('加载行业列表失败:', error);
|
||||
}
|
||||
},
|
||||
|
||||
async loadSectors() {
|
||||
try {
|
||||
const response = await axios.get('/api/market/sectors');
|
||||
this.sectors = response.data.data;
|
||||
} catch (error) {
|
||||
console.error('加载概念板块失败:', error);
|
||||
}
|
||||
},
|
||||
|
||||
async loadOverview() {
|
||||
this.overviewLoading = true;
|
||||
try {
|
||||
const response = await axios.get('/api/market/overview');
|
||||
this.overview = response.data.data;
|
||||
} catch (error) {
|
||||
console.error('加载市场概览失败:', error);
|
||||
} finally {
|
||||
this.overviewLoading = false;
|
||||
}
|
||||
},
|
||||
|
||||
searchStocks() {
|
||||
clearTimeout(this.searchTimer);
|
||||
this.searchTimer = setTimeout(() => {
|
||||
this.pagination.page = 1;
|
||||
this.loadStocks();
|
||||
}, 500);
|
||||
},
|
||||
|
||||
filterStocks() {
|
||||
this.pagination.page = 1;
|
||||
this.loadStocks();
|
||||
},
|
||||
|
||||
async loadHotStocks(rankType) {
|
||||
if (!rankType) return;
|
||||
|
||||
this.loading = true;
|
||||
try {
|
||||
const response = await axios.get('/api/market/hot-stocks', {
|
||||
params: { rank_type: rankType, limit: 50 }
|
||||
});
|
||||
this.stocks = response.data.data;
|
||||
this.pagination.total = response.data.data.length;
|
||||
this.pagination.pages = 1;
|
||||
} catch (error) {
|
||||
console.error('加载热门股票失败:', error);
|
||||
} finally {
|
||||
this.loading = false;
|
||||
}
|
||||
},
|
||||
|
||||
async syncData() {
|
||||
this.syncing = true;
|
||||
try {
|
||||
const response = await axios.post('/api/market/sync');
|
||||
if (response.data.message) {
|
||||
alert('数据同步成功!');
|
||||
this.loadOverview();
|
||||
this.loadStocks();
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('同步数据失败:', error);
|
||||
alert('数据同步失败: ' + (error.response?.data?.error || error.message));
|
||||
} finally {
|
||||
this.syncing = false;
|
||||
}
|
||||
},
|
||||
|
||||
refreshOverview() {
|
||||
this.loadOverview();
|
||||
},
|
||||
|
||||
changePage(page) {
|
||||
if (page >= 1 && page <= this.pagination.pages) {
|
||||
this.pagination.page = page;
|
||||
this.loadStocks();
|
||||
}
|
||||
},
|
||||
|
||||
formatVolume(volume) {
|
||||
if (!volume) return '-';
|
||||
if (volume >= 100000000) {
|
||||
return (volume / 100000000).toFixed(2) + '亿';
|
||||
} else if (volume >= 10000) {
|
||||
return (volume / 10000).toFixed(2) + '万';
|
||||
}
|
||||
return volume.toString();
|
||||
},
|
||||
|
||||
formatAmount(amount) {
|
||||
if (!amount) return '-';
|
||||
if (amount >= 10000) {
|
||||
return (amount / 10000).toFixed(2) + '亿';
|
||||
}
|
||||
return amount.toFixed(2);
|
||||
},
|
||||
|
||||
getSectorNames(sectorNames) {
|
||||
if (!sectorNames) return [];
|
||||
return sectorNames.split(',');
|
||||
},
|
||||
|
||||
async showStockDetail(stockCode) {
|
||||
this.selectedStock = { stock_code: stockCode };
|
||||
this.stockDetailLoading = true;
|
||||
|
||||
try {
|
||||
const response = await axios.get(`/api/market/stocks/${stockCode}`);
|
||||
this.selectedStock = response.data.data;
|
||||
} catch (error) {
|
||||
console.error('获取股票详情失败:', error);
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await axios.get(`/api/market/stocks/${stockCode}/kline`, {
|
||||
params: { days: 60 }
|
||||
});
|
||||
this.stockDetail = response.data;
|
||||
this.$nextTick(() => {
|
||||
this.renderKlineChart();
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('获取K线数据失败:', error);
|
||||
} finally {
|
||||
this.stockDetailLoading = false;
|
||||
}
|
||||
|
||||
const modal = new bootstrap.Modal(document.getElementById('stockModal'));
|
||||
modal.show();
|
||||
},
|
||||
|
||||
renderKlineChart() {
|
||||
if (!this.$refs.klineChart || !this.stockDetail?.data) return;
|
||||
|
||||
const klineData = this.stockDetail.data;
|
||||
const dates = klineData.map(item => item.date);
|
||||
const prices = klineData.map(item => item.close);
|
||||
const volumes = klineData.map(item => item.volume / 1000);
|
||||
|
||||
if (this.klineChart) {
|
||||
this.klineChart.dispose();
|
||||
}
|
||||
|
||||
this.klineChart = echarts.init(this.$refs.klineChart);
|
||||
|
||||
const option = {
|
||||
title: {
|
||||
text: `${this.selectedStock.stock_name} K线图`,
|
||||
left: 'center'
|
||||
},
|
||||
tooltip: {
|
||||
trigger: 'axis',
|
||||
axisPointer: {
|
||||
type: 'cross'
|
||||
}
|
||||
},
|
||||
legend: {
|
||||
data: ['收盘价', '成交量'],
|
||||
top: 30
|
||||
},
|
||||
xAxis: {
|
||||
type: 'category',
|
||||
data: dates,
|
||||
axisPointer: {
|
||||
type: 'shadow'
|
||||
}
|
||||
},
|
||||
yAxis: [
|
||||
{
|
||||
type: 'value',
|
||||
name: '价格',
|
||||
position: 'left',
|
||||
axisLabel: {
|
||||
formatter: '{value}元'
|
||||
}
|
||||
},
|
||||
{
|
||||
type: 'value',
|
||||
name: '成交量',
|
||||
position: 'right',
|
||||
axisLabel: {
|
||||
formatter: '{value}千手'
|
||||
}
|
||||
}
|
||||
],
|
||||
series: [
|
||||
{
|
||||
name: '收盘价',
|
||||
type: 'line',
|
||||
data: prices,
|
||||
smooth: true,
|
||||
itemStyle: {
|
||||
color: '#1890ff'
|
||||
}
|
||||
},
|
||||
{
|
||||
name: '成交量',
|
||||
type: 'bar',
|
||||
yAxisIndex: 1,
|
||||
data: volumes,
|
||||
itemStyle: {
|
||||
color: '#ffa940'
|
||||
}
|
||||
}
|
||||
],
|
||||
grid: {
|
||||
left: '3%',
|
||||
right: '4%',
|
||||
bottom: '3%',
|
||||
containLabel: true
|
||||
}
|
||||
};
|
||||
|
||||
this.klineChart.setOption(option);
|
||||
},
|
||||
|
||||
async addToWatchlist(stockCode) {
|
||||
try {
|
||||
const formData = new FormData();
|
||||
formData.append('stock_code', stockCode);
|
||||
|
||||
const response = await axios.post('/api/add_watch', formData);
|
||||
if (response.data.status === 'success') {
|
||||
alert('已添加到监控列表!');
|
||||
} else {
|
||||
alert('添加失败: ' + (response.data.error || '未知错误'));
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('添加到监控列表失败:', error);
|
||||
alert('添加失败: ' + (error.response?.data?.error || error.message));
|
||||
}
|
||||
}
|
||||
}
|
||||
}).mount('#app');
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
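The template's `loadStocks()` above assumes a paginated payload with `data`, `total` and `pages` fields from `/api/market/stocks`. As a hedged sketch (field names taken from the template; the helper itself is an assumption, not the actual route code), the backend side could build that shape like this:

```python
from math import ceil

def build_stock_page(rows: list[dict], page: int, size: int, total: int) -> dict:
    # Shape consumed by loadStocks(): data / total / pages (page and size echoed for convenience)
    return {
        "data": rows,
        "total": total,
        "pages": max(1, ceil(total / size)),
        "page": page,
        "size": size,
    }
```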
22
backup/json_backup_20251124_093028/config.json
Normal file
@@ -0,0 +1,22 @@
{
  "watchlist": {
    "600179": {
      "target_market_value": {
        "min": null,
        "max": null
      }
    },
    "600589": {
      "target_market_value": {
        "min": null,
        "max": null
      }
    },
    "002065": {
      "target_market_value": {
        "min": null,
        "max": null
      }
    }
  }
}
124
docs/database/apply_extended_schema.py
Normal file
@@ -0,0 +1,124 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
应用扩展数据库结构脚本
|
||||
执行新的数据库表结构和索引创建
|
||||
"""
|
||||
import sys
|
||||
import os
|
||||
from pathlib import Path
|
||||
|
||||
# 添加项目根目录到Python路径
|
||||
project_root = Path(__file__).parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
from app.database import DatabaseManager
|
||||
|
||||
|
||||
def apply_extended_schema():
|
||||
"""应用扩展的数据库结构"""
|
||||
print("正在应用扩展的数据库结构...")
|
||||
|
||||
# 读取SQL脚本
|
||||
schema_file = project_root / "database_schema_extended.sql"
|
||||
if not schema_file.exists():
|
||||
print(f"✗ 扩展数据库结构文件不存在: {schema_file}")
|
||||
return False
|
||||
|
||||
with open(schema_file, 'r', encoding='utf-8') as f:
|
||||
sql_content = f.read()
|
||||
|
||||
db_manager = DatabaseManager()
|
||||
|
||||
try:
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 分割SQL语句并执行
|
||||
statements = [stmt.strip() for stmt in sql_content.split(';') if stmt.strip()]
|
||||
|
||||
for statement in statements:
|
||||
if statement:
|
||||
try:
|
||||
cursor.execute(statement)
|
||||
print(f"✓ 执行成功: {statement[:50]}...")
|
||||
except Exception as e:
|
||||
# 忽略表已存在的错误
|
||||
if "already exists" not in str(e) and "Duplicate" not in str(e):
|
||||
print(f"⚠️ 警告: 执行SQL语句失败: {statement[:50]}... 错误: {e}")
|
||||
|
||||
conn.commit()
|
||||
print("✓ 扩展数据库结构应用成功")
|
||||
cursor.close()
|
||||
|
||||
# 验证新表是否创建成功
|
||||
verify_tables_created()
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 应用扩展数据库结构失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def verify_tables_created():
|
||||
"""验证新表是否创建成功"""
|
||||
print("正在验证新表是否创建成功...")
|
||||
|
||||
db_manager = DatabaseManager()
|
||||
new_tables = [
|
||||
'industries', 'sectors', 'kline_data', 'stock_sector_relations',
|
||||
'market_statistics', 'data_update_tasks', 'hot_stocks'
|
||||
]
|
||||
|
||||
try:
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
for table_name in new_tables:
|
||||
cursor.execute(f"SHOW TABLES LIKE '{table_name}'")
|
||||
exists = cursor.fetchone()
|
||||
|
||||
if exists:
|
||||
print(f"✓ 表 {table_name} 创建成功")
|
||||
else:
|
||||
print(f"✗ 表 {table_name} 创建失败")
|
||||
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 验证表创建失败: {e}")
|
||||
|
||||
|
||||
def main():
|
||||
"""主函数"""
|
||||
print("=" * 60)
|
||||
print("应用扩展数据库结构")
|
||||
print("=" * 60)
|
||||
|
||||
success = apply_extended_schema()
|
||||
|
||||
if success:
|
||||
print("\n" + "=" * 60)
|
||||
print("扩展数据库结构应用完成!")
|
||||
print("=" * 60)
|
||||
print("\n下一步操作:")
|
||||
print("1. 启动应用系统: python run.py")
|
||||
print("2. 访问 http://localhost:8000/api/market/stocks 查看股票列表")
|
||||
print("3. 访问 http://localhost:8000/api/market/sync 同步市场数据")
|
||||
print("=" * 60)
|
||||
else:
|
||||
print("✗ 扩展数据库结构应用失败,请检查错误信息")
|
||||
|
||||
return success
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
try:
|
||||
success = main()
|
||||
sys.exit(0 if success else 1)
|
||||
except KeyboardInterrupt:
|
||||
print("\n数据库结构应用被用户中断")
|
||||
sys.exit(1)
|
||||
except Exception as e:
|
||||
print(f"\n数据库结构应用过程中发生错误: {e}")
|
||||
sys.exit(1)
|
||||
137
docs/database/database_schema.sql
Normal file
@@ -0,0 +1,137 @@
|
||||
-- 股票监控系统数据库表结构
|
||||
-- Database: stock_monitor
|
||||
|
||||
-- 1. 股票基础信息表
|
||||
CREATE TABLE IF NOT EXISTS stocks (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL UNIQUE COMMENT '股票代码',
|
||||
stock_name VARCHAR(50) NOT NULL COMMENT '股票名称',
|
||||
market VARCHAR(10) NOT NULL COMMENT '市场(SH/SZ)',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
INDEX idx_stock_code (stock_code)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='股票基础信息表';
|
||||
|
||||
-- 2. 股票实时数据表
|
||||
CREATE TABLE IF NOT EXISTS stock_data (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
data_date DATE NOT NULL COMMENT '数据日期',
|
||||
|
||||
-- 基本信息
|
||||
price DECIMAL(10,3) DEFAULT NULL COMMENT '当前股价',
|
||||
change_percent DECIMAL(8,4) DEFAULT NULL COMMENT '涨跌幅',
|
||||
market_value DECIMAL(12,3) DEFAULT NULL COMMENT '总市值(亿元)',
|
||||
|
||||
-- 估值指标
|
||||
pe_ratio DECIMAL(8,4) DEFAULT NULL COMMENT '市盈率',
|
||||
pb_ratio DECIMAL(8,4) DEFAULT NULL COMMENT '市净率',
|
||||
ps_ratio DECIMAL(8,4) DEFAULT NULL COMMENT '市销率',
|
||||
dividend_yield DECIMAL(8,4) DEFAULT NULL COMMENT '股息率',
|
||||
|
||||
-- 财务指标
|
||||
roe DECIMAL(8,4) DEFAULT NULL COMMENT '净资产收益率',
|
||||
gross_profit_margin DECIMAL(8,4) DEFAULT NULL COMMENT '销售毛利率',
|
||||
net_profit_margin DECIMAL(8,4) DEFAULT NULL COMMENT '销售净利率',
|
||||
debt_to_assets DECIMAL(8,4) DEFAULT NULL COMMENT '资产负债率',
|
||||
revenue_yoy DECIMAL(8,4) DEFAULT NULL COMMENT '营收同比增长率',
|
||||
net_profit_yoy DECIMAL(8,4) DEFAULT NULL COMMENT '净利润同比增长率',
|
||||
bps DECIMAL(8,4) DEFAULT NULL COMMENT '每股净资产',
|
||||
ocfps DECIMAL(8,4) DEFAULT NULL COMMENT '每股经营现金流',
|
||||
|
||||
-- 元数据
|
||||
from_cache BOOLEAN DEFAULT FALSE COMMENT '是否来自缓存',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_date (stock_code, data_date),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_data_date (data_date),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='股票实时数据表';
|
||||
|
||||
-- 3. 用户监控列表表
|
||||
CREATE TABLE IF NOT EXISTS watchlist (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
target_market_value_min DECIMAL(12,3) DEFAULT NULL COMMENT '目标市值最小值(亿元)',
|
||||
target_market_value_max DECIMAL(12,3) DEFAULT NULL COMMENT '目标市值最大值(亿元)',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_code (stock_code),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='用户监控列表表';
|
||||
|
||||
-- 4. AI分析结果表
|
||||
CREATE TABLE IF NOT EXISTS ai_analysis (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
analysis_type VARCHAR(20) NOT NULL COMMENT '分析类型(stock/dao/daka)',
|
||||
analysis_date DATE NOT NULL COMMENT '分析日期',
|
||||
|
||||
-- 投资建议
|
||||
investment_summary TEXT COMMENT '投资建议摘要',
|
||||
investment_action TEXT COMMENT '建议操作',
|
||||
investment_key_points JSON COMMENT '关键要点',
|
||||
|
||||
-- 详细分析
|
||||
valuation_analysis TEXT COMMENT '估值分析',
|
||||
financial_analysis TEXT COMMENT '财务分析',
|
||||
growth_analysis TEXT COMMENT '成长性分析',
|
||||
risk_analysis TEXT COMMENT '风险分析',
|
||||
|
||||
-- 价格分析
|
||||
reasonable_price_min DECIMAL(10,3) DEFAULT NULL COMMENT '合理价格最小值',
|
||||
reasonable_price_max DECIMAL(10,3) DEFAULT NULL COMMENT '合理价格最大值',
|
||||
target_market_value_min DECIMAL(12,3) DEFAULT NULL COMMENT '目标市值最小值(亿元)',
|
||||
target_market_value_max DECIMAL(12,3) DEFAULT NULL COMMENT '目标市值最大值(亿元)',
|
||||
|
||||
-- 元数据
|
||||
from_cache BOOLEAN DEFAULT FALSE COMMENT '是否来自缓存',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_type_date (stock_code, analysis_type, analysis_date),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_analysis_type (analysis_type),
|
||||
INDEX idx_analysis_date (analysis_date),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='AI分析结果表';
|
||||
|
||||
-- 5. 系统配置表
|
||||
CREATE TABLE IF NOT EXISTS system_config (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
config_key VARCHAR(50) NOT NULL UNIQUE COMMENT '配置键',
|
||||
config_value TEXT COMMENT '配置值',
|
||||
config_type VARCHAR(20) DEFAULT 'string' COMMENT '配置类型',
|
||||
description VARCHAR(200) DEFAULT NULL COMMENT '配置描述',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
INDEX idx_config_key (config_key)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='系统配置表';
|
||||
|
||||
-- 6. 数据更新日志表
|
||||
CREATE TABLE IF NOT EXISTS data_update_log (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
data_type VARCHAR(20) NOT NULL COMMENT '数据类型',
|
||||
stock_code VARCHAR(10) DEFAULT NULL COMMENT '股票代码',
|
||||
update_status ENUM('success', 'failed', 'partial') NOT NULL COMMENT '更新状态',
|
||||
update_message TEXT COMMENT '更新消息',
|
||||
execution_time DECIMAL(8,3) DEFAULT NULL COMMENT '执行时间(秒)',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
|
||||
INDEX idx_data_type (data_type),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_update_status (update_status),
|
||||
INDEX idx_created_at (created_at)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='数据更新日志表';
|
||||
|
||||
-- 插入系统默认配置
|
||||
INSERT INTO system_config (config_key, config_value, config_type, description) VALUES
|
||||
('tushare_api_calls_today', '0', 'integer', 'Tushare API calls today'),
|
||||
('last_data_update_date', '', 'date', 'Last data update date'),
|
||||
('cache_expiration_hours', '24', 'integer', 'Cache expiration time in hours'),
|
||||
('max_watchlist_size', '50', 'integer', 'Maximum watchlist size')
|
||||
ON DUPLICATE KEY UPDATE config_value = VALUES(config_value);
|
||||
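As a usage sketch against the tables above, the query below pulls the most recent stock_data row for every watched stock. It reuses the DatabaseManager/cursor pattern shown in the init scripts of this commit; the helper itself and the selected columns are illustrative assumptions.

```python
from app.database import DatabaseManager

LATEST_WATCHLIST_SQL = """
    SELECT s.stock_code, s.stock_name, d.data_date, d.price, d.pe_ratio, d.market_value
    FROM watchlist w
    JOIN stocks s ON s.stock_code = w.stock_code
    JOIN stock_data d ON d.stock_code = w.stock_code
    WHERE d.data_date = (
        SELECT MAX(data_date) FROM stock_data WHERE stock_code = w.stock_code
    )
"""

def latest_watchlist_snapshot():
    # Assumes get_connection() yields a cursor-capable connection as in init_database.py
    db = DatabaseManager()
    with db.get_connection() as conn:
        cursor = conn.cursor()
        cursor.execute(LATEST_WATCHLIST_SQL)
        rows = cursor.fetchall()
        cursor.close()
    return rows
```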
213
docs/database/database_schema_extended.sql
Normal file
@@ -0,0 +1,213 @@
|
||||
-- 股票监控系统扩展数据库结构 (支持全市场股票)
|
||||
-- 在原有表结构基础上添加新功能
|
||||
|
||||
-- 1. 行业分类表
|
||||
CREATE TABLE IF NOT EXISTS industries (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
industry_code VARCHAR(20) NOT NULL UNIQUE,
|
||||
industry_name VARCHAR(100) NOT NULL,
|
||||
parent_code VARCHAR(20) NULL,
|
||||
level INT DEFAULT 1 COMMENT '行业层级',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
INDEX idx_industry_code (industry_code),
|
||||
INDEX idx_parent_code (parent_code)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 2. 概念/板块表
|
||||
CREATE TABLE IF NOT EXISTS sectors (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
sector_code VARCHAR(20) NOT NULL UNIQUE,
|
||||
sector_name VARCHAR(100) NOT NULL,
|
||||
description TEXT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
INDEX idx_sector_code (sector_code)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 3. 扩展股票表,添加行业和板块信息
|
||||
ALTER TABLE stocks ADD COLUMN IF NOT EXISTS industry_code VARCHAR(20) NULL;
|
||||
ALTER TABLE stocks ADD COLUMN IF NOT EXISTS sector_code VARCHAR(20) NULL;
|
||||
ALTER TABLE stocks ADD COLUMN IF NOT EXISTS list_date DATE NULL COMMENT '上市日期';
|
||||
ALTER TABLE stocks ADD COLUMN IF NOT EXISTS is_active BOOLEAN DEFAULT TRUE COMMENT '是否活跃交易';
|
||||
ALTER TABLE stocks ADD COLUMN IF NOT EXISTS market_type VARCHAR(20) NULL COMMENT '市场类型:主板/创业板/科创板等';
|
||||
|
||||
-- 添加外键约束
|
||||
ALTER TABLE stocks
|
||||
ADD CONSTRAINT fk_stock_industry
|
||||
FOREIGN KEY (industry_code) REFERENCES industries(industry_code)
|
||||
ON DELETE SET NULL ON UPDATE CASCADE;
|
||||
|
||||
ALTER TABLE stocks
|
||||
ADD CONSTRAINT fk_stock_sector
|
||||
FOREIGN KEY (sector_code) REFERENCES sectors(sector_code)
|
||||
ON DELETE SET NULL ON UPDATE CASCADE;
|
||||
|
||||
-- 4. K线数据表 (日K、周K、月K)
|
||||
CREATE TABLE IF NOT EXISTS kline_data (
|
||||
id BIGINT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
kline_type ENUM('daily', 'weekly', 'monthly') NOT NULL DEFAULT 'daily',
|
||||
trade_date DATE NOT NULL,
|
||||
open_price DECIMAL(10,3) NOT NULL,
|
||||
high_price DECIMAL(10,3) NOT NULL,
|
||||
low_price DECIMAL(10,3) NOT NULL,
|
||||
close_price DECIMAL(10,3) NOT NULL,
|
||||
volume BIGINT NOT NULL DEFAULT 0,
|
||||
amount DECIMAL(15,2) NOT NULL DEFAULT 0 COMMENT '成交额(万元)',
|
||||
change_percent DECIMAL(8,4) DEFAULT 0 COMMENT '涨跌幅(%)',
|
||||
change_amount DECIMAL(10,3) DEFAULT 0 COMMENT '涨跌额',
|
||||
turnover_rate DECIMAL(8,4) DEFAULT 0 COMMENT '换手率(%)',
|
||||
pe_ratio DECIMAL(10,2) DEFAULT NULL COMMENT '市盈率',
|
||||
pb_ratio DECIMAL(10,2) DEFAULT NULL COMMENT '市净率',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
UNIQUE KEY uk_stock_kline_date (stock_code, kline_type, trade_date),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_trade_date (trade_date),
|
||||
INDEX idx_kline_type (kline_type),
|
||||
INDEX idx_stock_type_date (stock_code, kline_type, trade_date)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 5. 股票-板块关联表 (支持多个概念板块)
|
||||
CREATE TABLE IF NOT EXISTS stock_sector_relations (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
sector_code VARCHAR(20) NOT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
UNIQUE KEY uk_stock_sector (stock_code, sector_code),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_sector_code (sector_code),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE ON UPDATE CASCADE,
|
||||
FOREIGN KEY (sector_code) REFERENCES sectors(sector_code) ON DELETE CASCADE ON UPDATE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 6. 市场行情统计表 (每日统计)
|
||||
CREATE TABLE IF NOT EXISTS market_statistics (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stat_date DATE NOT NULL UNIQUE,
|
||||
market_code VARCHAR(10) NOT NULL COMMENT '市场代码: SH/SZ/BJ',
|
||||
total_stocks INT DEFAULT 0 COMMENT '总股票数',
|
||||
up_stocks INT DEFAULT 0 COMMENT '上涨股票数',
|
||||
down_stocks INT DEFAULT 0 COMMENT '下跌股票数',
|
||||
flat_stocks INT DEFAULT 0 COMMENT '平盘股票数',
|
||||
limit_up_stocks INT DEFAULT 0 COMMENT '涨停股票数',
|
||||
limit_down_stocks INT DEFAULT 0 COMMENT '跌停股票数',
|
||||
total_volume BIGINT DEFAULT 0 COMMENT '总成交量',
|
||||
total_amount DECIMAL(15,2) DEFAULT 0 COMMENT '总成交额(亿元)',
|
||||
index_change DECIMAL(8,4) DEFAULT 0 COMMENT '主要指数涨跌幅',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
INDEX idx_stat_date (stat_date),
|
||||
INDEX idx_market_code (market_code)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 7. 数据更新任务表
|
||||
CREATE TABLE IF NOT EXISTS data_update_tasks (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
task_name VARCHAR(100) NOT NULL,
|
||||
task_type ENUM('daily_basic', 'kline_data', 'stock_list', 'industry_data') NOT NULL,
|
||||
status ENUM('pending', 'running', 'completed', 'failed') DEFAULT 'pending',
|
||||
start_time TIMESTAMP NULL,
|
||||
end_time TIMESTAMP NULL,
|
||||
processed_count INT DEFAULT 0 COMMENT '已处理数量',
|
||||
total_count INT DEFAULT 0 COMMENT '总数量',
|
||||
error_message TEXT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
INDEX idx_task_type (task_type),
|
||||
INDEX idx_status (status),
|
||||
INDEX idx_created_at (created_at)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 8. 热门股票统计表
|
||||
CREATE TABLE IF NOT EXISTS hot_stocks (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
stat_date DATE NOT NULL,
|
||||
rank_type ENUM('volume', 'amount', 'change', 'turnover') NOT NULL,
|
||||
rank_position INT NOT NULL COMMENT '排名位置',
|
||||
rank_value DECIMAL(15,2) NOT NULL COMMENT '排名值',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
UNIQUE KEY uk_hot_stock_rank (stock_code, stat_date, rank_type),
|
||||
INDEX idx_stat_date (stat_date),
|
||||
INDEX idx_rank_type (rank_type),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE ON UPDATE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 初始化基础行业数据
|
||||
INSERT IGNORE INTO industries (industry_code, industry_name, level) VALUES
|
||||
('A01', '农林牧渔', 1),
|
||||
('B02', '采矿业', 1),
|
||||
('C03', '制造业', 1),
|
||||
('D04', '电力、热力、燃气及水生产和供应业', 1),
|
||||
('E05', '建筑业', 1),
|
||||
('F06', '批发和零售业', 1),
|
||||
('G07', '交通运输、仓储和邮政业', 1),
|
||||
('H08', '住宿和餐饮业', 1),
|
||||
('I09', '信息传输、软件和信息技术服务业', 1),
|
||||
('J10', '金融业', 1),
|
||||
('K11', '房地产业', 1),
|
||||
('L12', '租赁和商务服务业', 1),
|
||||
('M13', '科学研究和技术服务业', 1),
|
||||
('N14', '水利、环境和公共设施管理业', 1),
|
||||
('O15', '居民服务、修理和其他服务业', 1),
|
||||
('P16', '教育', 1),
|
||||
('Q17', '卫生和社会工作', 1),
|
||||
('R18', '文化、体育和娱乐业', 1),
|
||||
('S19', '综合', 1);
|
||||
|
||||
-- 初始化主要概念板块
|
||||
INSERT IGNORE INTO sectors (sector_code, sector_name, description) VALUES
|
||||
('BK0453', '新能源汽车', '新能源汽车产业链相关股票'),
|
||||
('BK0885', '人工智能', '人工智能技术应用相关股票'),
|
||||
('BK0500', '半导体', '半导体芯片设计、制造、封测相关股票'),
|
||||
('BK0476', '医疗器械', '医疗器械设备和服务相关股票'),
|
||||
('BK0727', '军工', '国防军工装备制造相关股票'),
|
||||
('BK0489', '光伏概念', '光伏产业链相关股票'),
|
||||
('BK0729', '5G概念', '第五代移动通信技术相关股票'),
|
||||
('BK0896', '国产软件', '国产软件和信息服务相关股票'),
|
||||
('BK0582', '碳中和', '碳中和发展目标相关股票'),
|
||||
('BK0456', '生物医药', '生物制药和医药研发相关股票'),
|
||||
('BK0857', '数字货币', '数字货币和区块链相关股票'),
|
||||
('BK0735', '新基建', '新型基础设施建设相关股票'),
|
||||
('BK0557', '大消费', '消费升级相关股票'),
|
||||
('BK0726', '国企改革', '国有企业改革相关股票'),
|
||||
('BK0439', '雄安新区', '雄安新区建设相关股票');
|
||||
|
||||
-- 更新 stocks 表的索引
|
||||
ALTER TABLE stocks ADD INDEX IF NOT EXISTS idx_industry_code (industry_code);
|
||||
ALTER TABLE stocks ADD INDEX IF NOT EXISTS idx_sector_code (sector_code);
|
||||
ALTER TABLE stocks ADD INDEX IF NOT EXISTS idx_market_type (market_type);
|
||||
ALTER TABLE stocks ADD INDEX IF NOT EXISTS idx_is_active (is_active);
|
||||
|
||||
-- 为现有数据添加一些示例的行业和板块分类 (如果需要的话)
|
||||
UPDATE stocks
|
||||
SET industry_code = 'I09', market_type = '创业板'
|
||||
WHERE stock_code LIKE '00%' AND stock_code IN ('002065', '002415', '002230');
|
||||
|
||||
UPDATE stocks
|
||||
SET industry_code = 'C03', market_type = '主板'
|
||||
WHERE stock_code LIKE '60%' AND stock_code IN ('600589', '600179', '600000');
|
||||
|
||||
UPDATE stocks
|
||||
SET industry_code = 'C03', market_type = '科创板'
|
||||
WHERE stock_code LIKE '68%';
|
||||
|
||||
-- 创建视图便于查询
|
||||
CREATE OR REPLACE VIEW v_stock_detail AS
|
||||
SELECT
|
||||
s.stock_code,
|
||||
s.stock_name,
|
||||
s.market,
|
||||
s.market_type,
|
||||
i.industry_name,
|
||||
GROUP_CONCAT(sec.sector_name) as sector_names,
|
||||
s.list_date,
|
||||
s.is_active,
|
||||
s.created_at
|
||||
FROM stocks s
|
||||
LEFT JOIN industries i ON s.industry_code = i.industry_code
|
||||
LEFT JOIN stock_sector_relations ssr ON s.stock_code = ssr.stock_code
|
||||
LEFT JOIN sectors sec ON ssr.sector_code = sec.sector_code
|
||||
WHERE s.is_active = TRUE
|
||||
GROUP BY s.stock_code, s.stock_name, s.market, s.market_type, i.industry_name, s.list_date, s.is_active, s.created_at;
|
||||
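A small sketch of reading the v_stock_detail view defined above, again via the DatabaseManager used elsewhere in this commit. Filtering by industry_name and the %s placeholder style (MySQL drivers such as PyMySQL) are assumptions for illustration.

```python
from app.database import DatabaseManager

def stocks_in_industry(industry_name: str, limit: int = 20):
    # v_stock_detail already joins stocks, industries and the sector relations
    sql = (
        "SELECT stock_code, stock_name, market_type, sector_names "
        "FROM v_stock_detail WHERE industry_name = %s LIMIT %s"
    )
    db = DatabaseManager()
    with db.get_connection() as conn:
        cursor = conn.cursor()
        cursor.execute(sql, (industry_name, limit))
        rows = cursor.fetchall()
        cursor.close()
    return rows
```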
128
docs/database/database_schema_simple.sql
Normal file
@@ -0,0 +1,128 @@
|
||||
-- Stock Monitor Database Schema
|
||||
-- Database: stock_monitor
|
||||
|
||||
-- 1. Stocks table
|
||||
CREATE TABLE IF NOT EXISTS stocks (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL UNIQUE,
|
||||
stock_name VARCHAR(50) NOT NULL,
|
||||
market VARCHAR(10) NOT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
INDEX idx_stock_code (stock_code)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 2. Stock data table
|
||||
CREATE TABLE IF NOT EXISTS stock_data (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
data_date DATE NOT NULL,
|
||||
|
||||
-- Basic info
|
||||
price DECIMAL(10,3) DEFAULT NULL,
|
||||
change_percent DECIMAL(8,4) DEFAULT NULL,
|
||||
market_value DECIMAL(12,3) DEFAULT NULL,
|
||||
|
||||
-- Valuation metrics
|
||||
pe_ratio DECIMAL(8,4) DEFAULT NULL,
|
||||
pb_ratio DECIMAL(8,4) DEFAULT NULL,
|
||||
ps_ratio DECIMAL(8,4) DEFAULT NULL,
|
||||
dividend_yield DECIMAL(8,4) DEFAULT NULL,
|
||||
|
||||
-- Financial metrics
|
||||
roe DECIMAL(8,4) DEFAULT NULL,
|
||||
gross_profit_margin DECIMAL(8,4) DEFAULT NULL,
|
||||
net_profit_margin DECIMAL(8,4) DEFAULT NULL,
|
||||
debt_to_assets DECIMAL(8,4) DEFAULT NULL,
|
||||
revenue_yoy DECIMAL(8,4) DEFAULT NULL,
|
||||
net_profit_yoy DECIMAL(8,4) DEFAULT NULL,
|
||||
bps DECIMAL(8,4) DEFAULT NULL,
|
||||
ocfps DECIMAL(8,4) DEFAULT NULL,
|
||||
|
||||
-- Metadata
|
||||
from_cache BOOLEAN DEFAULT FALSE,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_date (stock_code, data_date),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_data_date (data_date),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 3. Watchlist table
|
||||
CREATE TABLE IF NOT EXISTS watchlist (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
target_market_value_min DECIMAL(12,3) DEFAULT NULL,
|
||||
target_market_value_max DECIMAL(12,3) DEFAULT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_code (stock_code),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 4. AI Analysis table
|
||||
CREATE TABLE IF NOT EXISTS ai_analysis (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
stock_code VARCHAR(10) NOT NULL,
|
||||
analysis_type VARCHAR(20) NOT NULL,
|
||||
analysis_date DATE NOT NULL,
|
||||
|
||||
-- Investment suggestions
|
||||
investment_summary TEXT,
|
||||
investment_action TEXT,
|
||||
investment_key_points JSON,
|
||||
|
||||
-- Detailed analysis
|
||||
valuation_analysis TEXT,
|
||||
financial_analysis TEXT,
|
||||
growth_analysis TEXT,
|
||||
risk_analysis TEXT,
|
||||
|
||||
-- Price analysis
|
||||
reasonable_price_min DECIMAL(10,3) DEFAULT NULL,
|
||||
reasonable_price_max DECIMAL(10,3) DEFAULT NULL,
|
||||
target_market_value_min DECIMAL(12,3) DEFAULT NULL,
|
||||
target_market_value_max DECIMAL(12,3) DEFAULT NULL,
|
||||
|
||||
-- Metadata
|
||||
from_cache BOOLEAN DEFAULT FALSE,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
UNIQUE KEY uk_stock_type_date (stock_code, analysis_type, analysis_date),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_analysis_type (analysis_type),
|
||||
INDEX idx_analysis_date (analysis_date),
|
||||
FOREIGN KEY (stock_code) REFERENCES stocks(stock_code) ON DELETE CASCADE
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 5. System config table
|
||||
CREATE TABLE IF NOT EXISTS system_config (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
config_key VARCHAR(50) NOT NULL UNIQUE,
|
||||
config_value TEXT,
|
||||
config_type VARCHAR(20) DEFAULT 'string',
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
|
||||
INDEX idx_config_key (config_key)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 6. Data update log table
|
||||
CREATE TABLE IF NOT EXISTS data_update_log (
|
||||
id INT AUTO_INCREMENT PRIMARY KEY,
|
||||
data_type VARCHAR(20) NOT NULL,
|
||||
stock_code VARCHAR(10) DEFAULT NULL,
|
||||
update_status ENUM('success', 'failed', 'partial') NOT NULL,
|
||||
update_message TEXT,
|
||||
execution_time DECIMAL(8,3) DEFAULT NULL,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
|
||||
INDEX idx_data_type (data_type),
|
||||
INDEX idx_stock_code (stock_code),
|
||||
INDEX idx_update_status (update_status),
|
||||
INDEX idx_created_at (created_at)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
159
docs/database/init_database.py
Normal file
@@ -0,0 +1,159 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
数据库初始化脚本
|
||||
创建数据库表结构并初始化基础数据
|
||||
"""
|
||||
import sys
|
||||
import os
|
||||
from pathlib import Path
|
||||
|
||||
# 添加项目根目录到Python路径
|
||||
project_root = Path(__file__).parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
from app.database import DatabaseManager
|
||||
from app.config import Config
|
||||
|
||||
|
||||
def create_database():
|
||||
"""创建数据库"""
|
||||
print("正在创建数据库...")
|
||||
|
||||
# 创建数据库管理器,连接到MySQL服务器(不指定数据库)
|
||||
db_manager = DatabaseManager()
|
||||
|
||||
try:
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 创建数据库
|
||||
cursor.execute(f"CREATE DATABASE IF NOT EXISTS {Config.MYSQL_DATABASE} CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci")
|
||||
print(f"✓ 数据库 {Config.MYSQL_DATABASE} 创建成功")
|
||||
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 创建数据库失败: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def create_tables():
|
||||
"""创建数据表"""
|
||||
print("正在创建数据表...")
|
||||
|
||||
# 读取SQL脚本
|
||||
schema_file = project_root / "database_schema.sql"
|
||||
if not schema_file.exists():
|
||||
print(f"✗ 数据库表结构文件不存在: {schema_file}")
|
||||
return False
|
||||
|
||||
with open(schema_file, 'r', encoding='utf-8') as f:
|
||||
sql_content = f.read()
|
||||
|
||||
db_manager = DatabaseManager()
|
||||
|
||||
try:
|
||||
with db_manager.get_connection() as conn:
|
||||
cursor = conn.cursor()
|
||||
|
||||
# 分割SQL语句并执行
|
||||
statements = [stmt.strip() for stmt in sql_content.split(';') if stmt.strip()]
|
||||
|
||||
for statement in statements:
|
||||
if statement:
|
||||
try:
|
||||
cursor.execute(statement)
|
||||
except Exception as e:
|
||||
# 忽略表已存在的错误
|
||||
if "already exists" not in str(e):
|
||||
print(f"警告: 执行SQL语句失败: {statement[:50]}... 错误: {e}")
|
||||
|
||||
conn.commit()
|
||||
print("✓ 数据表创建成功")
|
||||
cursor.close()
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 创建数据表失败: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def test_connection():
|
||||
"""测试数据库连接"""
|
||||
print("正在测试数据库连接...")
|
||||
|
||||
try:
|
||||
from app.dao import StockDAO, WatchlistDAO, AIAnalysisDAO, ConfigDAO
|
||||
|
||||
# 测试各个DAO
|
||||
stock_dao = StockDAO()
|
||||
watchlist_dao = WatchlistDAO()
|
||||
ai_dao = AIAnalysisDAO()
|
||||
config_dao = ConfigDAO()
|
||||
|
||||
# 获取数据库状态
|
||||
stock_count = stock_dao.get_stock_count()
|
||||
watchlist_count = watchlist_dao.get_watchlist_count()
|
||||
ai_count = ai_dao.get_analysis_count()
|
||||
|
||||
print(f"✓ 数据库连接成功")
|
||||
print(f" - 股票数量: {stock_count}")
|
||||
print(f" - 监控列表: {watchlist_count}")
|
||||
print(f" - AI分析: {ai_count}")
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 数据库连接失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def main():
|
||||
"""主函数"""
|
||||
print("=" * 60)
|
||||
print("股票监控系统数据库初始化")
|
||||
print("=" * 60)
|
||||
print(f"数据库主机: {Config.MYSQL_HOST}:{Config.MYSQL_PORT}")
|
||||
print(f"数据库名称: {Config.MYSQL_DATABASE}")
|
||||
print(f"数据库用户: {Config.MYSQL_USER}")
|
||||
print("=" * 60)
|
||||
|
||||
# 1. 创建数据库
|
||||
if not create_database():
|
||||
print("数据库创建失败,初始化终止")
|
||||
return False
|
||||
|
||||
# 2. 创建数据表
|
||||
if not create_tables():
|
||||
print("数据表创建失败,初始化终止")
|
||||
return False
|
||||
|
||||
# 3. 测试连接
|
||||
if not test_connection():
|
||||
print("数据库连接测试失败,初始化终止")
|
||||
return False
|
||||
|
||||
print("\n" + "=" * 60)
|
||||
print("数据库初始化完成!")
|
||||
print("=" * 60)
|
||||
print("\n下一步操作:")
|
||||
print("1. 运行数据迁移脚本: python migrate_to_database.py")
|
||||
print("2. 启动应用系统")
|
||||
print("=" * 60)
|
||||
|
||||
return True
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
try:
|
||||
success = main()
|
||||
sys.exit(0 if success else 1)
|
||||
except KeyboardInterrupt:
|
||||
print("\n数据库初始化被用户中断")
|
||||
sys.exit(1)
|
||||
except Exception as e:
|
||||
print(f"\n数据库初始化过程中发生错误: {e}")
|
||||
sys.exit(1)
|
||||
325
docs/database/migrate_to_database.py
Normal file
@@ -0,0 +1,325 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
数据迁移脚本:将JSON文件数据迁移到MySQL数据库
|
||||
"""
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
from datetime import datetime
|
||||
from pathlib import Path
|
||||
|
||||
# 添加项目根目录到Python路径
|
||||
project_root = Path(__file__).parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
from app.dao import StockDAO, WatchlistDAO, AIAnalysisDAO, ConfigDAO
|
||||
from app.config import Config
|
||||
|
||||
|
||||
class DataMigration:
|
||||
def __init__(self):
|
||||
self.stock_dao = StockDAO()
|
||||
self.watchlist_dao = WatchlistDAO()
|
||||
self.ai_dao = AIAnalysisDAO()
|
||||
self.config_dao = ConfigDAO()
|
||||
|
||||
# JSON文件路径
|
||||
self.config_file = Config.CONFIG_FILE
|
||||
self.cache_file = os.path.join(Config.BASE_DIR, "stock_cache.json")
|
||||
self.ai_cache_dir = os.path.join(Config.BASE_DIR, "ai_stock_analysis")
|
||||
self.dao_cache_dir = os.path.join(Config.BASE_DIR, "dao_analysis")
|
||||
self.daka_cache_dir = os.path.join(Config.BASE_DIR, "daka_analysis")
|
||||
|
||||
print("数据迁移工具初始化完成")
|
||||
print(f"配置文件: {self.config_file}")
|
||||
print(f"股票缓存文件: {self.cache_file}")
|
||||
print(f"AI分析缓存目录: {self.ai_cache_dir}")
|
||||
|
||||
def migrate_watchlist(self):
|
||||
"""迁移监控列表"""
|
||||
print("\n开始迁移监控列表...")
|
||||
|
||||
if not os.path.exists(self.config_file):
|
||||
print("配置文件不存在,跳过监控列表迁移")
|
||||
return 0
|
||||
|
||||
try:
|
||||
with open(self.config_file, 'r', encoding='utf-8') as f:
|
||||
config_data = json.load(f)
|
||||
|
||||
watchlist = config_data.get('watchlist', {})
|
||||
migrated_count = 0
|
||||
|
||||
for stock_code, targets in watchlist.items():
|
||||
try:
|
||||
target_min = targets.get('target_market_value', {}).get('min')
|
||||
target_max = targets.get('target_market_value', {}).get('max')
|
||||
|
||||
success = self.watchlist_dao.add_to_watchlist(
|
||||
stock_code, target_min, target_max
|
||||
)
|
||||
|
||||
if success:
|
||||
migrated_count += 1
|
||||
print(f"✓ 迁移监控股票: {stock_code}")
|
||||
else:
|
||||
print(f"✗ 迁移失败: {stock_code}")
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 迁移股票 {stock_code} 失败: {e}")
|
||||
|
||||
print(f"监控列表迁移完成,共迁移 {migrated_count} 支股票")
|
||||
return migrated_count
|
||||
|
||||
except Exception as e:
|
||||
print(f"监控列表迁移失败: {e}")
|
||||
return 0
|
||||
|
||||
def migrate_stock_cache(self):
|
||||
"""迁移股票缓存数据"""
|
||||
print("\n开始迁移股票缓存数据...")
|
||||
|
||||
if not os.path.exists(self.cache_file):
|
||||
print("股票缓存文件不存在,跳过缓存数据迁移")
|
||||
return 0
|
||||
|
||||
try:
|
||||
with open(self.cache_file, 'r', encoding='utf-8') as f:
|
||||
cache_data = json.load(f)
|
||||
|
||||
migrated_count = 0
|
||||
|
||||
for stock_code, data in cache_data.items():
|
||||
try:
|
||||
stock_info = data.get('data', {}).get('stock_info', {})
|
||||
timestamp = data.get('timestamp', datetime.now().strftime('%Y-%m-%d'))
|
||||
|
||||
# 迁移股票信息
|
||||
if stock_info:
|
||||
stock_name = stock_info.get('name', '')
|
||||
market = 'SH' if stock_code.startswith('6') else 'SZ'
|
||||
|
||||
# 添加或更新股票基础信息
|
||||
self.stock_dao.add_or_update_stock(stock_code, stock_name, market)
|
||||
|
||||
# 保存股票数据
|
||||
success = self.stock_dao.save_stock_data(
|
||||
stock_code, stock_info, timestamp
|
||||
)
|
||||
|
||||
if success:
|
||||
migrated_count += 1
|
||||
print(f"✓ 迁移股票数据: {stock_code} ({timestamp})")
|
||||
else:
|
||||
print(f"✗ 迁移失败: {stock_code}")
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 迁移股票数据 {stock_code} 失败: {e}")
|
||||
|
||||
print(f"股票缓存数据迁移完成,共迁移 {migrated_count} 条记录")
|
||||
return migrated_count
|
||||
|
||||
except Exception as e:
|
||||
print(f"股票缓存数据迁移失败: {e}")
|
||||
return 0
|
||||
|
||||
def migrate_ai_analysis(self, cache_dir: str, analysis_type: str):
|
||||
"""迁移AI分析数据"""
|
||||
if not os.path.exists(cache_dir):
|
||||
print(f"分析缓存目录不存在: {cache_dir}")
|
||||
return 0
|
||||
|
||||
migrated_count = 0
|
||||
|
||||
try:
|
||||
for filename in os.listdir(cache_dir):
|
||||
if filename.endswith('.json'):
|
||||
stock_code = filename[:-5] # 移除.json后缀
|
||||
file_path = os.path.join(cache_dir, filename)
|
||||
|
||||
try:
|
||||
with open(file_path, 'r', encoding='utf-8') as f:
|
||||
analysis_data = json.load(f)
|
||||
|
||||
# 获取文件修改时间作为分析日期
|
||||
file_mtime = os.path.getmtime(file_path)
|
||||
analysis_date = datetime.fromtimestamp(file_mtime).strftime('%Y-%m-%d')
|
||||
|
||||
# 保存到数据库
|
||||
success = self.ai_dao.save_analysis(
|
||||
stock_code, analysis_type, analysis_data, analysis_date
|
||||
)
|
||||
|
||||
if success:
|
||||
migrated_count += 1
|
||||
print(f"✓ 迁移{analysis_type}分析: {stock_code}")
|
||||
else:
|
||||
print(f"✗ 迁移失败: {stock_code}")
|
||||
|
||||
except Exception as e:
|
||||
print(f"✗ 迁移{analysis_type}分析 {stock_code} 失败: {e}")
|
||||
|
||||
return migrated_count
|
||||
|
||||
except Exception as e:
|
||||
print(f"{analysis_type}分析数据迁移失败: {e}")
|
||||
return 0
|
||||
|
||||
def migrate_all_ai_analysis(self):
|
||||
"""迁移所有AI分析数据"""
|
||||
print("\n开始迁移AI分析数据...")
|
||||
|
||||
total_migrated = 0
|
||||
|
||||
# 迁移标准AI分析
|
||||
print("迁移标准AI分析...")
|
||||
count = self.migrate_ai_analysis(self.ai_cache_dir, 'stock')
|
||||
total_migrated += count
|
||||
print(f"标准AI分析迁移完成,共 {count} 条")
|
||||
|
||||
# 迁移道德经分析
|
||||
print("\n迁移道德经分析...")
|
||||
count = self.migrate_ai_analysis(self.dao_cache_dir, 'dao')
|
||||
total_migrated += count
|
||||
print(f"道德经分析迁移完成,共 {count} 条")
|
||||
|
||||
# 迁移大咖分析
|
||||
print("\n迁移大咖分析...")
|
||||
count = self.migrate_ai_analysis(self.daka_cache_dir, 'daka')
|
||||
total_migrated += count
|
||||
print(f"大咖分析迁移完成,共 {count} 条")
|
||||
|
||||
print(f"\nAI分析数据迁移完成,共迁移 {total_migrated} 条记录")
|
||||
return total_migrated
|
||||
|
||||
def backup_json_files(self):
|
||||
"""备份JSON文件"""
|
||||
print("\n备份JSON文件...")
|
||||
|
||||
backup_dir = os.path.join(Config.BASE_DIR, f"json_backup_{datetime.now().strftime('%Y%m%d_%H%M%S')}")
|
||||
os.makedirs(backup_dir, exist_ok=True)
|
||||
|
||||
files_to_backup = [
|
||||
(self.config_file, "config.json"),
|
||||
(self.cache_file, "stock_cache.json")
|
||||
]
|
||||
|
||||
directories_to_backup = [
|
||||
(self.ai_cache_dir, "ai_stock_analysis"),
|
||||
(self.dao_cache_dir, "dao_analysis"),
|
||||
(self.daka_cache_dir, "daka_analysis")
|
||||
]
|
||||
|
||||
import shutil
|
||||
|
||||
# 备份文件
|
||||
for file_path, filename in files_to_backup:
|
||||
if os.path.exists(file_path):
|
||||
shutil.copy2(file_path, os.path.join(backup_dir, filename))
|
||||
print(f"✓ 备份文件: {filename}")
|
||||
|
||||
# 备份目录
|
||||
for dir_path, dirname in directories_to_backup:
|
||||
if os.path.exists(dir_path):
|
||||
shutil.copytree(dir_path, os.path.join(backup_dir, dirname), dirs_exist_ok=True)
|
||||
print(f"✓ 备份目录: {dirname}")
|
||||
|
||||
print(f"JSON文件备份完成,备份位置: {backup_dir}")
|
||||
return backup_dir
|
||||
|
||||
def run_full_migration(self):
|
||||
"""执行完整数据迁移"""
|
||||
print("=" * 60)
|
||||
print("开始从JSON到数据库的完整数据迁移")
|
||||
print("=" * 60)
|
||||
|
||||
# 备份JSON文件
|
||||
backup_dir = self.backup_json_files()
|
||||
|
||||
# 执行迁移
|
||||
try:
|
||||
watchlist_count = self.migrate_watchlist()
|
||||
stock_cache_count = self.migrate_stock_cache()
|
||||
ai_analysis_count = self.migrate_all_ai_analysis()
|
||||
|
||||
print("\n" + "=" * 60)
|
||||
print("数据迁移完成!")
|
||||
print("=" * 60)
|
||||
print(f"监控列表: {watchlist_count} 条")
|
||||
print(f"股票缓存数据: {stock_cache_count} 条")
|
||||
print(f"AI分析数据: {ai_analysis_count} 条")
|
||||
print(f"JSON文件备份: {backup_dir}")
|
||||
print("=" * 60)
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f"\n数据迁移过程中发生错误: {e}")
|
||||
print("请检查数据库连接和权限设置")
|
||||
return False
|
||||
|
||||
def verify_migration(self):
|
||||
"""验证迁移结果"""
|
||||
print("\n验证迁移结果...")
|
||||
|
||||
try:
|
||||
# 检查股票数据
|
||||
stock_count = self.stock_dao.get_stock_count()
|
||||
print(f"数据库中股票数量: {stock_count}")
|
||||
|
||||
# 检查监控列表
|
||||
watchlist_count = self.watchlist_dao.get_watchlist_count()
|
||||
print(f"监控列表股票数量: {watchlist_count}")
|
||||
|
||||
# 检查AI分析数据
|
||||
ai_analysis_count = self.ai_dao.get_analysis_count()
|
||||
print(f"AI分析记录数量: {ai_analysis_count}")
|
||||
|
||||
# 检查日期范围
|
||||
date_range = self.stock_dao.get_data_date_range()
|
||||
if date_range:
|
||||
print(f"数据日期范围: {date_range.get('min_date')} 至 {date_range.get('max_date')}")
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f"验证迁移结果失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def main():
|
||||
"""主函数"""
|
||||
migration = DataMigration()
|
||||
|
||||
print("数据迁移工具")
|
||||
print("1. 执行完整迁移")
|
||||
print("2. 仅迁移监控列表")
|
||||
print("3. 仅迁移股票缓存")
|
||||
print("4. 仅迁移AI分析")
|
||||
print("5. 验证迁移结果")
|
||||
|
||||
try:
|
||||
choice = input("\n请选择操作 (1-5): ").strip()
|
||||
|
||||
if choice == '1':
|
||||
migration.run_full_migration()
|
||||
migration.verify_migration()
|
||||
elif choice == '2':
|
||||
migration.migrate_watchlist()
|
||||
elif choice == '3':
|
||||
migration.migrate_stock_cache()
|
||||
elif choice == '4':
|
||||
migration.migrate_all_ai_analysis()
|
||||
elif choice == '5':
|
||||
migration.verify_migration()
|
||||
else:
|
||||
print("无效选择")
|
||||
|
||||
except KeyboardInterrupt:
|
||||
print("\n\n迁移被用户中断")
|
||||
except Exception as e:
|
||||
print(f"\n迁移过程中发生错误: {e}")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
239
docs/database/test_database.py
Normal file
@ -0,0 +1,239 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
数据库功能测试脚本
|
||||
"""
|
||||
import sys
|
||||
import os
|
||||
from pathlib import Path
|
||||
|
||||
# 添加项目根目录到Python路径
|
||||
project_root = Path(__file__).parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
from app.dao import StockDAO, WatchlistDAO, AIAnalysisDAO, ConfigDAO
|
||||
from app.services.stock_service_db import StockServiceDB
|
||||
from app.services.ai_analysis_service_db import AIAnalysisServiceDB
|
||||
|
||||
|
||||
def test_database_connection():
|
||||
"""测试数据库连接"""
|
||||
print("1. 测试数据库连接...")
|
||||
|
||||
try:
|
||||
from app.database import DatabaseManager
|
||||
db_manager = DatabaseManager()
|
||||
|
||||
with db_manager.get_cursor() as cursor:
|
||||
cursor.execute("SELECT 1 as test")
|
||||
result = cursor.fetchone()
|
||||
|
||||
if result and result['test'] == 1:
|
||||
print(" ✓ 数据库连接正常")
|
||||
return True
|
||||
else:
|
||||
print(" ✗ 数据库连接异常")
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
print(f" ✗ 数据库连接失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def test_dao_functions():
|
||||
"""测试DAO层功能"""
|
||||
print("\n2. 测试DAO层功能...")
|
||||
|
||||
try:
|
||||
# 测试各个DAO
|
||||
stock_dao = StockDAO()
|
||||
watchlist_dao = WatchlistDAO()
|
||||
ai_dao = AIAnalysisDAO()
|
||||
config_dao = ConfigDAO()
|
||||
|
||||
# 测试基础查询
|
||||
stock_count = stock_dao.get_stock_count()
|
||||
watchlist_count = watchlist_dao.get_watchlist_count()
|
||||
ai_count = ai_dao.get_analysis_count()
|
||||
|
||||
print(f" ✓ 股票数量: {stock_count}")
|
||||
print(f" ✓ 监控列表: {watchlist_count}")
|
||||
print(f" ✓ AI分析: {ai_count}")
|
||||
|
||||
# 测试配置读写
|
||||
config_dao.set_config('test_key', 'test_value', 'string')
|
||||
test_value = config_dao.get_config('test_key')
|
||||
|
||||
if test_value == 'test_value':
|
||||
print(" ✓ 配置读写正常")
|
||||
else:
|
||||
print(" ✗ 配置读写异常")
|
||||
return False
|
||||
|
||||
# 清理测试数据
|
||||
config_dao.delete_config('test_key')
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f" ✗ DAO层测试失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def test_stock_service():
|
||||
"""测试股票服务"""
|
||||
print("\n3. 测试股票服务...")
|
||||
|
||||
try:
|
||||
stock_service = StockServiceDB()
|
||||
|
||||
# 测试监控列表功能
|
||||
watchlist = stock_service.get_watchlist()
|
||||
print(f" ✓ 获取监控列表: {len(watchlist)} 项")
|
||||
|
||||
if watchlist:
|
||||
# 测试获取股票信息(使用第一只股票)
|
||||
stock_code = watchlist[0].get('stock_code') or watchlist[0].get('code')
|
||||
if stock_code:
|
||||
print(f" ✓ 测试股票: {stock_code}")
|
||||
|
||||
# 测试获取股票信息
|
||||
stock_info = stock_service.get_stock_info(stock_code)
|
||||
if 'error' not in stock_info:
|
||||
print(" ✓ 股票信息获取正常")
|
||||
else:
|
||||
print(f" ✗ 股票信息获取失败: {stock_info.get('error')}")
|
||||
return False
|
||||
|
||||
# 测试指数信息
|
||||
index_info = stock_service.get_index_info()
|
||||
if index_info:
|
||||
print(f" ✓ 指数信息获取正常: {len(index_info)} 个指数")
|
||||
else:
|
||||
print(" ✗ 指数信息获取失败")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f" ✗ 股票服务测试失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def test_ai_service():
|
||||
"""测试AI分析服务"""
|
||||
print("\n4. 测试AI分析服务...")
|
||||
|
||||
try:
|
||||
ai_service = AIAnalysisServiceDB()
|
||||
stock_service = StockServiceDB()
|
||||
|
||||
# 获取一只测试股票
|
||||
watchlist = stock_service.get_watchlist()
|
||||
if not watchlist:
|
||||
print(" ⚠️ 监控列表为空,跳过AI服务测试")
|
||||
return True
|
||||
|
||||
stock_code = watchlist[0].get('stock_code') or watchlist[0].get('code')
|
||||
|
||||
# 测试价值分析数据获取
|
||||
value_data = stock_service.get_value_analysis_data(stock_code)
|
||||
if 'error' not in value_data:
|
||||
print(" ✓ 价值分析数据获取正常")
|
||||
else:
|
||||
print(f" ✗ 价值分析数据获取失败: {value_data.get('error')}")
|
||||
return False
|
||||
|
||||
# 测试AI分析历史记录
|
||||
history = ai_service.get_analysis_history(stock_code, 'stock', 7)
|
||||
print(f" ✓ AI分析历史记录: {len(history)} 条")
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f" ✗ AI服务测试失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def test_api_compatibility():
|
||||
"""测试API兼容性"""
|
||||
print("\n5. 测试API兼容性...")
|
||||
|
||||
try:
|
||||
from app.services.stock_service_db import StockServiceDB
|
||||
from app.services.ai_analysis_service_db import AIAnalysisServiceDB
|
||||
|
||||
# 测试服务实例化
|
||||
stock_service = StockServiceDB()
|
||||
ai_service = AIAnalysisServiceDB()
|
||||
|
||||
print(" ✓ 数据库服务实例化正常")
|
||||
|
||||
# 测试方法是否存在
|
||||
required_methods = [
|
||||
'get_stock_info', 'get_watchlist', 'add_watch', 'remove_watch',
|
||||
'update_target', 'get_index_info'
|
||||
]
|
||||
|
||||
for method in required_methods:
|
||||
if hasattr(stock_service, method):
|
||||
print(f" ✓ 方法存在: {method}")
|
||||
else:
|
||||
print(f" ✗ 方法缺失: {method}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f" ✗ API兼容性测试失败: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def main():
|
||||
"""主测试函数"""
|
||||
print("=" * 60)
|
||||
print("股票监控系统数据库功能测试")
|
||||
print("=" * 60)
|
||||
|
||||
tests = [
|
||||
test_database_connection,
|
||||
test_dao_functions,
|
||||
test_stock_service,
|
||||
test_ai_service,
|
||||
test_api_compatibility
|
||||
]
|
||||
|
||||
passed = 0
|
||||
total = len(tests)
|
||||
|
||||
for test in tests:
|
||||
try:
|
||||
if test():
|
||||
passed += 1
|
||||
except Exception as e:
|
||||
print(f" 测试异常: {e}")
|
||||
|
||||
print("\n" + "=" * 60)
|
||||
print(f"测试完成!")
|
||||
print(f"通过: {passed}/{total}")
|
||||
print("=" * 60)
|
||||
|
||||
if passed == total:
|
||||
print("🎉 所有测试通过!数据库迁移成功!")
|
||||
print("\n系统现在可以正常使用数据库存储。")
|
||||
print("如需回滚到JSON文件存储,请参考 DATABASE_MIGRATION_GUIDE.md")
|
||||
else:
|
||||
print("⚠️ 部分测试未通过,请检查配置和数据库连接。")
|
||||
|
||||
return passed == total
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
try:
|
||||
success = main()
|
||||
sys.exit(0 if success else 1)
|
||||
except KeyboardInterrupt:
|
||||
print("\n测试被用户中断")
|
||||
sys.exit(1)
|
||||
except Exception as e:
|
||||
print(f"\n测试过程中发生错误: {e}")
|
||||
sys.exit(1)
|
||||
182
docs/guides/DATABASE_MIGRATION_GUIDE.md
Normal file
@ -0,0 +1,182 @@
# Stock Monitor Database Migration Guide

This guide walks you through migrating the stock monitoring system from JSON file storage to MySQL database storage.

## Migration Overview

### Before migration
- Watchlist: stored in the `config.json` file
- Stock cache: stored in the `stock_cache.json` file
- AI analysis results: stored as JSON files in several cache directories

### After migration
- All data is stored centrally in the `stock_monitor` MySQL database
- Supports data queries, history tracking and cache management
- Better performance and data consistency

## Database Structure

### Main tables
1. **stocks** - basic stock information
2. **stock_data** - real-time stock data
3. **watchlist** - watchlist entries
4. **ai_analysis** - AI analysis results
5. **system_config** - system configuration
6. **data_update_log** - data update log

See `database_schema.sql` for the full table definitions; a rough sketch of one table is shown below.
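The authoritative DDL lives in `database_schema.sql`, which is not reproduced in this diff. The snippet below is only a sketch of what the `watchlist` table might look like; the column names and types are assumptions, and the connection values mirror the defaults from `app/config.py` shown in the next section.

```python
# Sketch only: the real definitions are in database_schema.sql.
# Column names and types here are illustrative assumptions.
import pymysql

WATCHLIST_DDL = """
CREATE TABLE IF NOT EXISTS watchlist (
    id          INT AUTO_INCREMENT PRIMARY KEY,
    stock_code  VARCHAR(10) NOT NULL UNIQUE,
    target_low  DECIMAL(16,2) NULL,   -- lower bound of the target market cap
    target_high DECIMAL(16,2) NULL,   -- upper bound of the target market cap
    created_at  TIMESTAMP DEFAULT CURRENT_TIMESTAMP
) DEFAULT CHARSET = utf8mb4;
"""

conn = pymysql.connect(host="localhost", port=3306, user="root",
                       password="password", database="stock_monitor")
try:
    with conn.cursor() as cur:
        cur.execute(WATCHLIST_DDL)
    conn.commit()
finally:
    conn.close()
```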

## Migration Steps

### 1. Prepare the database environment

Make sure the MySQL service is running and the connection settings are correct.

Check the database configuration in `app/config.py`:
```python
MYSQL_HOST = os.getenv('MYSQL_HOST', 'localhost')
MYSQL_PORT = int(os.getenv('MYSQL_PORT', 3306))
MYSQL_USER = os.getenv('MYSQL_USER', 'root')
MYSQL_PASSWORD = os.getenv('MYSQL_PASSWORD', 'password')
MYSQL_DATABASE = os.getenv('MYSQL_DATABASE', 'stock_monitor')
```

### 2. Initialize the database

Run the database initialization script:

```bash
python init_database.py
```

The script will:
- Create the `stock_monitor` database
- Create all required tables
- Insert the default system configuration
- Test the database connection

### 3. Run the data migration

Run the migration script:

```bash
python migrate_to_database.py
```

The script offers the following options:
1. Run the full migration
2. Migrate the watchlist only
3. Migrate the stock cache only
4. Migrate the AI analyses only
5. Verify the migration result

Option 1 (full migration) is recommended.

### 4. Back up the original data

The migration script automatically backs up the original JSON files; the backup directory is named:
```
json_backup_YYYYMMDD_HHMMSS/
```

## System Changes

### New files
- `app/database.py` - database connection management
- `app/dao/` - data access object layer
- `app/services/stock_service_db.py` - database-backed stock service
- `app/services/ai_analysis_service_db.py` - database-backed AI analysis service
- `init_database.py` - database initialization script
- `migrate_to_database.py` - data migration script

### Modified files
- `app/api/stock_routes.py` - switched to the database-backed services
- `app/config.py` - added the database configuration

## Usage

### Start the system
After the migration, start the system as usual:
```bash
uvicorn app.main:app --reload
```

### Verify the migration
1. Check that the watchlist migrated completely
2. Verify that stock data loads correctly
3. Confirm that the AI analysis features still work
4. Test adding and removing stocks

A small script that automates the first three checks is sketched below.
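Assuming the DAO classes added in this commit are importable, a quick sanity check can reuse the same count helpers that `migrate_to_database.py` calls in `verify_migration()`:

```python
# Quick post-migration sanity check using the DAO layer added in this commit.
from app.dao import StockDAO, WatchlistDAO, AIAnalysisDAO

def quick_check() -> None:
    print("stocks in database :", StockDAO().get_stock_count())
    print("watchlist entries  :", WatchlistDAO().get_watchlist_count())
    print("AI analysis records:", AIAnalysisDAO().get_analysis_count())

if __name__ == "__main__":
    quick_check()
```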

### Feature comparison
| Feature | Before migration | After migration |
|------|--------|--------|
| Watchlist | JSON file | Database table |
| Stock data cache | JSON file | Database table |
| AI analysis results | File cache | Database table |
| History | None | Supported |
| Data queries | File parsing | SQL queries |
| Performance | File I/O | Database optimizations |
| Data backup | Manual | Database tooling |

## Maintenance

### Data backup
Back up the database regularly:
```bash
mysqldump -u root -p stock_monitor > backup_$(date +%Y%m%d).sql
```

### Log monitoring
Query the `data_update_log` table to check the data update status:
```sql
SELECT * FROM data_update_log
WHERE update_status = 'failed'
ORDER BY created_at DESC;
```

### Performance tuning
1. Periodically purge stale stock data
2. Monitor the database connection pool
3. Optimize query indexes

## Rollback Plan

If you need to roll back to JSON file storage:

1. Restore the backed-up JSON files
2. Change the imports in `app/api/stock_routes.py`:
```python
# Comment out the database-backed services
# from app.services.stock_service_db import StockServiceDB
# from app.services.ai_analysis_service_db import AIAnalysisServiceDB

# Restore the original services
from app.services.stock_service import StockService
from app.services.ai_analysis_service import AIAnalysisService
```

## FAQ

### Q: What if the migration is interrupted?
A: The migration script can be resumed; running it again skips data that has already been migrated.

### Q: How do I fix database connection failures?
A: Check that the MySQL service is running, the connection settings are correct, and the firewall allows the connection.

### Q: What if the data is inconsistent after migration?
A: Use the verification option to check the result, or rerun the migration to overwrite the existing data.

### Q: How do I clean up test data?
A: Truncate the database tables and migrate again, or delete specific rows with SQL.

## Support

If you run into migration problems, check:
1. The MySQL service status
2. Database permissions
3. Network connectivity
4. Error messages in the log files

---

**After the migration, your stock monitoring system will be faster, more reliable and easier to extend!**
237
docs/guides/NEW_FEATURES_GUIDE.md
Normal file
@ -0,0 +1,237 @@
# Stock Monitor New Features Guide

## 🎉 Feature Overview

The following market-wide stock features have been added to the stock monitoring system:

### ✅ Implemented features

1. **📊 Market-wide stock data**
   - Basic information for all A-share stocks
   - Browsing by industry and concept sector
   - Real-time stock search and filtering

2. **📈 K-line data management**
   - Storage for daily, weekly and monthly K-line data
   - Historical K-line queries
   - K-line chart visualization

3. **🤖 Automated scheduled tasks**
   - Daily automatic stock list updates
   - Automatic K-line data updates
   - Market statistics calculation
   - Data cleanup and maintenance

4. **🖥️ Front-end user interface**
   - Stock market browsing page
   - Real-time market overview
   - Stock details with K-line charts
   - Industry and concept filtering

## 🚀 Quick Start

### 1. Apply the database schema
```bash
python apply_extended_schema.py
```
This script creates the new table structures that support market-wide stock data.

### 2. Start the system
```bash
python run.py
```
Once the system is up, the scheduled tasks start automatically.

### 3. Open the new pages
- **Stock market page**: http://localhost:8000/stocks
- **Original watchlist page**: http://localhost:8000/
- **Index quotes page**: http://localhost:8000/market

## 📋 Feature Details

### Stock market page (/stocks)

#### Market overview
- Market-wide advance/decline statistics
- Total trading volume and turnover
- Real-time refresh of market data

#### Stock browsing
- **Search**: by stock code or name
- **Industry filter**: browse stocks by industry
- **Concept filter**: browse stocks by concept sector
- **Hot rankings**: by volume, turnover and price change
- **Pagination**: efficient display of large stock lists

#### Stock details
- Click a stock to see its details
- K-line chart (60 days of history)
- Fundamental indicators and valuation data
- One-click add to the watchlist

### API Endpoints

#### Stock data endpoints
```bash
# List all stocks (supports pagination and filtering)
GET /api/market/stocks?page=1&size=50&industry=I09&search=银行

# Get details for one stock
GET /api/market/stocks/000001

# Get K-line data
GET /api/market/stocks/000001/kline?kline_type=daily&days=30

# List industries
GET /api/market/industries

# List concept sectors
GET /api/market/sectors
```
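These endpoints can also be exercised from Python. A minimal sketch, assuming the server from the Quick Start section is running on localhost:8000 (the exact response schema is whatever the API returns and is not documented here):

```python
# Minimal client sketch for the market endpoints listed above.
import requests

BASE = "http://localhost:8000"

# Paged stock list, filtered by a search keyword
stocks = requests.get(f"{BASE}/api/market/stocks",
                      params={"page": 1, "size": 50, "search": "银行"},
                      timeout=10)
stocks.raise_for_status()
print(stocks.json())

# 30 days of daily K-line data for one stock
kline = requests.get(f"{BASE}/api/market/stocks/000001/kline",
                     params={"kline_type": "daily", "days": 30},
                     timeout=10)
kline.raise_for_status()
print(kline.json())
```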

#### Market statistics endpoints
```bash
# Market overview
GET /api/market/overview

# Hot stock rankings
GET /api/market/hot-stocks?rank_type=volume&limit=20

# Synchronize market data
POST /api/market/sync
```

#### Scheduled task endpoints
```bash
# Run tasks manually
POST /api/market/tasks/update_stock_list
POST /api/market/tasks/update_daily_kline

# Get task execution status
GET /api/market/tasks/status?days=7
```

## 🔄 Scheduled Tasks

The system ships with the following automatic tasks (a sketch of how this timetable could be wired up follows below):

### Daily tasks
- **09:00** - Update the stock list (Mondays only)
- **09:30** - Update the current day's K-line data
- **16:00** - Calculate market statistics
- **20:00** - Refresh watchlist data

### Weekly tasks
- **Sunday 02:00** - Clean up old data (keep 6 months)

### Update policy
- Stock list: refreshed once a week on Monday
- K-line data: updated every trading day
- Market statistics: calculated every trading day
- Data cleanup: runs early Sunday morning
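The project's own scheduler module is not reproduced in this guide. Purely as an illustration, the timetable above could be expressed with the third-party `schedule` package roughly like this; the task function names are hypothetical placeholders, not the project's actual functions.

```python
# Sketch only: illustrates the timetable above with the `schedule` package.
# The real scheduler lives in the project's scheduler module; the task
# functions below are hypothetical placeholders.
import time
import schedule

def update_stock_list(): ...        # weekly stock list refresh
def update_daily_kline(): ...       # daily K-line update
def calc_market_statistics(): ...   # end-of-day market statistics
def update_watchlist_data(): ...    # refresh watchlist quotes
def cleanup_old_data(): ...         # drop data older than ~6 months

schedule.every().monday.at("09:00").do(update_stock_list)
schedule.every().day.at("09:30").do(update_daily_kline)
schedule.every().day.at("16:00").do(calc_market_statistics)
schedule.every().day.at("20:00").do(update_watchlist_data)
schedule.every().sunday.at("02:00").do(cleanup_old_data)

while True:
    schedule.run_pending()
    time.sleep(30)
```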

## 📊 Database Tables

### New tables

1. **industries** - industry classification
2. **sectors** - concept sectors
3. **kline_data** - K-line data
4. **stock_sector_relations** - stock/sector relations
5. **market_statistics** - market statistics
6. **data_update_tasks** - task execution records
7. **hot_stocks** - hot stock statistics

### Extended tables

- The **stocks** table gained industry, sector and market-type columns

## 🎨 Front-end Stack

- **Vue.js 3** - front-end framework
- **Bootstrap 5** - UI component library
- **ECharts** - charting library
- **Axios** - HTTP client

## 🛠️ Usage Tips

### 1. Data synchronization
On first use, click the "Sync data" button to pull the latest stock data.

### 2. Stock screening
- Use the search box to find a specific stock quickly
- Filter by industry and concept to discover opportunities
- Check the hot rankings to see where the market is active

### 3. K-line charts
- Click a stock to open its detailed K-line chart
- Daily, weekly and monthly periods are supported
- Combine price action with volume for analysis

### 4. Watchlist management
- Add a stock to the watchlist with one click from its detail page
- The original watchlist features remain fully compatible
- AI analysis also works for newly added stocks

## 📈 Performance

### Optimizations
- Database indexes speed up queries
- Pagination keeps memory usage low
- Caching reduces API calls
- Asynchronous task handling improves responsiveness

### Capacity planning
- Handles real-time data for 5000+ stocks
- Historical K-line data is cleaned up as needed
- Task execution status is monitored and logged

## 🔧 Troubleshooting

### Common issues

1. **Stock list is empty**
   - Check whether data synchronization has been run
   - Confirm the database connection works
   - Check the task execution status

2. **K-line chart does not render**
   - Confirm the stock code is correct
   - Check the network connection
   - Look for errors in the browser console

3. **Data is not updated in time**
   - Check that the scheduled tasks are running
   - Confirm the Tushare API quota is sufficient
   - Review the task execution logs

### Checking the logs
```bash
# Task execution status
curl http://localhost:8000/api/market/tasks/status

# Trigger a manual data sync
curl -X POST http://localhost:8000/api/market/sync
```

## 🎯 Next Steps

1. **Technical indicators**: add more technical analysis indicators
2. **Real-time push**: WebSocket-based live data updates
3. **Data export**: Excel and CSV export
4. **Personalization**: custom filters and alerts
5. **Mobile**: responsive design improvements

---

## 🎊 Summary

The new features are fully compatible with the existing system: the watchlist keeps working as before, while the system's data coverage and analysis capabilities grow substantially. You can now:

- 🔍 **Browse 5000+ stocks across the whole market**
- 📊 **View real-time K-line charts**
- 🏭 **Filter by industry and concept**
- 🔥 **Track the market's hot stocks**
- ⏰ **Enjoy fully automatic data updates**

Happy investing! 🚀
@ -1,7 +0,0 @@
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt # 禁用缓存减小体积
COPY . .
EXPOSE 8000
CMD ["python3","run.py"]
4
run.py
@ -8,5 +8,5 @@ if __name__ == "__main__":
port=8000, # 修改为8000端口
reload=True, # 启用热重载
log_level="debug", # 设置日志级别为debug
workers=1 # 开发模式使用单个worker
)
workers=1 # 开发模式使用单个workervg
)
2376
stock_cache.json
File diff suppressed because it is too large
193
项目文档.txt
@ -1,193 +0,0 @@
|
||||
# 价值投资盯盘系统项目文档
|
||||
|
||||
## 项目概述
|
||||
|
||||
**项目名称**:价值投资盯盘系统
|
||||
**开发者**:张艺杰
|
||||
**项目类型**:A股智能股票分析与监控平台
|
||||
**技术栈**:Python + FastAPI + Bootstrap + ECharts
|
||||
|
||||
## 系统功能
|
||||
|
||||
### 1. 核心功能
|
||||
|
||||
1. **股票监控**
|
||||
- 实时股票行情监控
|
||||
- 自定义市值目标区间
|
||||
- 多维度指标展示
|
||||
- 涨跌幅实时更新
|
||||
|
||||
2. **指数行情**
|
||||
- 主要指数实时展示
|
||||
- K线图可视化
|
||||
- 涨跌幅实时更新
|
||||
|
||||
3. **公司详情分析**
|
||||
- 公司基本信息
|
||||
- 财务指标分析
|
||||
- 股东结构分析
|
||||
- AI智能分析
|
||||
|
||||
### 2. 具体指标监控
|
||||
|
||||
#### 2.1 基础指标
|
||||
- 股票代码和名称
|
||||
- 现价和涨跌幅
|
||||
- 市值监控
|
||||
- 目标区间对比
|
||||
|
||||
#### 2.2 估值指标
|
||||
- 市盈率(PE)
|
||||
- 市净率(PB)
|
||||
- 市销率(PS)
|
||||
- 股息率
|
||||
|
||||
#### 2.3 财务指标
|
||||
- ROE(净资产收益率)
|
||||
- 毛利率
|
||||
- 净利率
|
||||
- 资产负债率
|
||||
- 净利润增长率
|
||||
- 每股净资产
|
||||
- 每股经营现金流
|
||||
|
||||
### 3. AI分析功能
|
||||
|
||||
1. **投资建议**
|
||||
- 总体建议
|
||||
- 建议操作
|
||||
- 关注重点
|
||||
|
||||
2. **价格分析**
|
||||
- 合理价格区间
|
||||
- 目标市值区间
|
||||
|
||||
3. **多维度分析**
|
||||
- 估值分析
|
||||
- 财务健康状况
|
||||
- 成长潜力
|
||||
- 风险评估
|
||||
|
||||
## 技术实现
|
||||
|
||||
### 1. 后端架构
|
||||
|
||||
1. **Web框架**
|
||||
- FastAPI作为主要Web框架
|
||||
- Uvicorn作为ASGI服务器
|
||||
|
||||
2. **数据源集成**
|
||||
- Tushare API接口
|
||||
|
||||
3. **数据处理**
|
||||
- Pandas进行数据分析
|
||||
- NumPy进行数值计算
|
||||
|
||||
### 2. 前端实现
|
||||
|
||||
1. **UI框架**
|
||||
- Bootstrap 5.1.3
|
||||
- 响应式设计
|
||||
|
||||
2. **数据可视化**
|
||||
- ECharts 5.4.3
|
||||
- 动态K线图表
|
||||
|
||||
3. **交互设计**
|
||||
- AJAX异步数据更新
|
||||
- 实时数据刷新
|
||||
- 模态框展示详情
|
||||
|
||||
### 3. 数据存储
|
||||
|
||||
1. **配置存储**
|
||||
- JSON文件存储监控列表
|
||||
- 配置文件自动管理
|
||||
|
||||
2. **缓存机制**
|
||||
- 行情数据缓存
|
||||
- 智能更新策略
|
||||
|
||||
## 部署要求
|
||||
|
||||
### 1. 系统要求
|
||||
- Python 3.8+
|
||||
- 8GB+ RAM
|
||||
- 现代浏览器支持
|
||||
|
||||
### 2. 依赖安装
|
||||
```bash
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
### 3. 配置说明
|
||||
- 需配置Tushare API Token
|
||||
- 配置端口默认为8000
|
||||
- 支持热重载
|
||||
|
||||
## 使用说明
|
||||
|
||||
### 1. 启动系统
|
||||
```bash
|
||||
python run.py
|
||||
```
|
||||
|
||||
### 2. 访问系统
|
||||
- 浏览器访问:`http://localhost:8000`
|
||||
|
||||
### 3. 基本操作
|
||||
1. 添加监控股票
|
||||
- 输入6位股票代码
|
||||
- 设置目标市值区间
|
||||
|
||||
2. 查看股票详情
|
||||
- 点击股票名称查看详细信息
|
||||
- 查看AI分析报告
|
||||
|
||||
3. 管理监控列表
|
||||
- 删除不需要的股票
|
||||
- 强制刷新数据
|
||||
|
||||
## 安全性考虑
|
||||
|
||||
1. **数据安全**
|
||||
- API Token安全存储
|
||||
- 敏感信息加密
|
||||
|
||||
2. **访问控制**
|
||||
- 请求频率限制
|
||||
- 错误处理机制
|
||||
|
||||
## 后续优化方向
|
||||
|
||||
1. **功能扩展**
|
||||
- 增加更多技术指标
|
||||
- 添加自定义告警功能
|
||||
- 支持多维度筛选
|
||||
|
||||
2. **性能优化**
|
||||
- 优化数据缓存机制
|
||||
- 提升响应速度
|
||||
- 减少资源占用
|
||||
|
||||
3. **用户体验**
|
||||
- 增加自定义主题
|
||||
- 优化移动端显示
|
||||
- 添加更多图表类型
|
||||
|
||||
## 维护说明
|
||||
|
||||
1. **日常维护**
|
||||
- 定期更新依赖
|
||||
- 检查API可用性
|
||||
- 优化数据缓存
|
||||
|
||||
2. **问题处理**
|
||||
- 日志监控
|
||||
- 异常处理
|
||||
- 性能监控
|
||||
|
||||
## 版权信息
|
||||
|
||||
版权所有 © 2024 张艺杰
|
||||
保留所有权利
|
||||