diff --git a/README.md b/README.md
index 0c52901..62ba55b 100644
--- a/README.md
+++ b/README.md
@@ -22,6 +22,7 @@ DeepSearcher combines cutting-edge LLMs (OpenAI o3, Qwen3, DeepSeek, Grok 4, Cla
- **Flexible Embedding Options**: Compatible with multiple embedding models for optimal selection.
- **Multiple LLM Support**: Supports DeepSeek, OpenAI, and other large models for intelligent Q&A and content generation.
- **Document Loader**: Supports local file loading, with web crawling capabilities under development.
+- **Web Interface**: Provides a user-friendly web interface for loading documents and performing queries.
---
@@ -92,6 +93,33 @@ load_from_website(urls=website_url)
# Query
result = query("Write a report about xxx.") # Your question here
```
+
+### Web Interface
+
+DeepSearcher now includes a web interface, so you can load documents and run queries from a browser instead of the Python API. To use it:
+
+1. Start the service:
+```shell
+python main.py
+```
+
+2. Open your browser and navigate to http://localhost:8000
+
+3. Use the web interface to:
+ - Load local files by specifying their paths
+ - Load website content by providing URLs
+ - Perform queries against the loaded data
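+
+If you prefer to script these operations rather than click through the browser UI, the same backend can be called over HTTP. The sketch below uses Python's `requests` library; the endpoint paths and parameter names are illustrative assumptions, not the project's documented API, so check the backend source for the actual routes.
+
+```python
+# Hedged sketch: driving the web backend over HTTP instead of the browser UI.
+# Endpoint paths and parameter names are assumptions for illustration only.
+import requests
+
+BASE_URL = "http://localhost:8000"
+
+# Load a local file into the vector database (hypothetical endpoint).
+requests.post(f"{BASE_URL}/load-files/", params={"paths": "/path/to/your/file.pdf"})
+
+# Load website content (hypothetical endpoint).
+requests.post(f"{BASE_URL}/load-website/", params={"urls": "https://example.com"})
+
+# Query against the loaded data (hypothetical endpoint).
+response = requests.get(f"{BASE_URL}/query/", params={"query": "Write a report about xxx."})
+print(response.json())
+```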
+
+You can also enable CORS support for development (for example, when a separate frontend runs on a different origin):
+```shell
+python main.py --enable-cors
+```
+
+To customize the host and port:
+```shell
+python main.py --host 127.0.0.1 --port 8080
+```
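+
+Both options above are ordinary command-line flags of `main.py`. The sketch below shows how such a CLI could be wired with FastAPI and Uvicorn; the `app` object and module layout are assumptions for illustration, not the project's actual backend code.
+
+```python
+import argparse
+
+import uvicorn
+from fastapi import FastAPI
+from fastapi.middleware.cors import CORSMiddleware
+
+app = FastAPI()  # assumed application object for this sketch
+
+if __name__ == "__main__":
+    parser = argparse.ArgumentParser(description="DeepSearcher web service")
+    parser.add_argument("--host", default="0.0.0.0", help="interface to bind")
+    parser.add_argument("--port", type=int, default=8000, help="port to listen on")
+    parser.add_argument("--enable-cors", action="store_true",
+                        help="allow cross-origin requests (development only)")
+    args = parser.parse_args()
+
+    if args.enable_cors:
+        # Permissive CORS policy, intended for local development only.
+        app.add_middleware(
+            CORSMiddleware,
+            allow_origins=["*"],
+            allow_methods=["*"],
+            allow_headers=["*"],
+        )
+
+    uvicorn.run(app, host=args.host, port=args.port)
+```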
+
### Configuration Details:
#### LLM Configuration
diff --git a/deepsearcher/backend/templates/index.html b/deepsearcher/backend/templates/index.html
new file mode 100644
index 0000000..a0ef791
--- /dev/null
+++ b/deepsearcher/backend/templates/index.html
@@ -0,0 +1,469 @@
+
+
+