Spaces: news_pynecone (build status: Build error)

Commit 0481dd5: Deploy news_pynecone!
Committed by timho102003
1 parent: e74c73c
Files changed:
- Dockerfile +1 -1
- README.md +4 -110
- default_app/default_app.py +0 -58
- {default_app → news_pynecone}/__init__.py +0 -0
- news_pynecone/home.py +402 -0
- news_pynecone/news_pynecone.py +38 -0
- pcconfig_docker.py +2 -2
Dockerfile
CHANGED
@@ -49,7 +49,7 @@ RUN python -c "from pathlib import Path; from pynecone.utils import setup_fronte
 
 # Copy existing project
 COPY --chown=pn pcconfig_docker.py pcconfig.py
-COPY --chown=pn
+COPY --chown=pn news_pynecone news_pynecone
 COPY --chown=pn assets assets
 
 # Re-init frontend (if new deps from project)
README.md
CHANGED
@@ -1,5 +1,5 @@
 ---
-title:
+title: News_pynecone
 emoji: 🐿️
 colorFrom: yellow
 colorTo: blue
@@ -9,114 +9,8 @@ pinned: false
 license: mit
 duplicated_from: Wauplin/pynecone-on-spaces-template
 ---
+## 🤗 Spaces x Pynecone
 
+This News_pynecone Space has been duplicated from the [Pynecone on Spaces template](https://huggingface.co/spaces/Wauplin/pynecone-on-spaces-template).
 
+To host and deploy your own Pynecone app on 🤗 Spaces, check out [the instructions](https://huggingface.co/spaces/Wauplin/pynecone-on-spaces-template/blob/main/README.md).
-If you encounter any issue, please let us know by creating a discussion in our [Community tab](https://huggingface.co/spaces/Wauplin/pynecone-on-spaces-template/discussions).
-
-### Instructions:
-
-#### 1. Install deployment tool
-
-Install the deployment script [from source](https://github.com/Wauplin/pynecone-on-spaces):
-```
-pip install git+https://github.com/Wauplin/pynecone-on-spaces
-```
-
-#### 2. (optional) Login to Huggingface
-
-If you don't already have an account, please [register](https://huggingface.co/join) to HF.
-You will next need to [setup a token](https://huggingface.co/settings/tokens) with `WRITE` permissions.
-Once you have it, login your machine to your account by running the following command in your terminal:
-
-```bash
-huggingface-cli login
-```
-
-#### 3. Deploy 🚀
-
-Simply deploy your pynecone app:
-
-```bash
-huggingface-pynecone deploy
-```
-
-**Output:**
-```txt
-New Space created: https://huggingface.co/spaces/Wauplin/clock
-README.md file not found locally. Adapting the existing remote one.
-Setup pcconfig.py
-Setup Dockerfile
-Sync requirements.txt
-Sync assets/ folder
-Sync app folder
-Your app has been successfully deployed!
-Check it out: https://Wauplin-clock.hf.space
-```
-
-The name of your repository will be derived from the app name defined in your `pcconfig.py` file.
-
-For more flexibility, you can specify some arguments:
-
-```bash
-huggingface-pynecone deploy --pcconfig_path="path/to/pcconfig.py" --private --token="hf_xxx"
-```
-
-#### 4. Sync changes
-
-Your project is now hosted in a versioned repository on Huggingface.
-
-You can decide to clone the repository and push your changes as you would do with any project.
-Please make sure to not break anything in `README.md` (metadata), `Dockerfile` and `pcconfig_docker.py`.
-
-Otherwise, you can make changes to your existing project and push your changes using the same command:
-
-```bash
-huggingface-pynecone deploy
-```
-
-**Output:**
-
-```txt
-Repo ID: Wauplin/clock
-Space already exist: https://huggingface.co/spaces/Wauplin/clock
-Space already configured. Do not update README.md, Dockerfile and pcconfig.py
-Sync requirements.txt
-Sync assets/ folder
-Sync app folder
-Your app has been successfully deployed!
-Check it out: https://Wauplin-clock.hf.space
-```
-
-### What tweaks are used to deploy you apps?
-
-Some tweaks are necessary to make your Pynecone deployed on HF Spaces.
-Everything is taken care of by the `huggingface-pynecone` CLI tool.
-
-List of small tweaks:
-- Running in a Docker Space means only 1 open port:
-  - We use nginx proxy to expose both frontend and backend on same port 4444
-  - Port 3000 (frontend) is redirected to localhost:4444/
-  - Port 8000 (backend) is redirected to localhost:4444/pynecone-backend
-- Port 4444 is set in several files:
-  - `README.md`: set `app_port` parameter
-  - `nginx.conf`: define which port to listen
-  - `pcconfig.py`: monkey-patch pynecone to make it think the backend is running on port 8000 even though a different `api_url` is set
-- `pc.Config` configuration:
-  - set env as `PROD`
-  - set `api_url` to the Space url + pynecone-backend path. Example: `"https://wauplin-pynecone-counter.hf.space/pynecone-backend"`
-  - set `port` to 3000 (frontend port)
-  - set db_url to `"sqlite:///pynecone.db"` but DB is not persisted across when the Space is reloaded (i.e.: not really a database yet)
-- Use a `run.sh` script to start nginx before running pynecone server
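For reference, the `pc.Config` tweaks listed in the removed README correspond to a generated `pcconfig.py` along these lines. This is a hedged sketch reconstructed from that list (the `app_name` and `api_url` values are taken from the README's own examples, not from this commit); the concrete values this commit actually uses appear in `pcconfig_docker.py` at the end of the diff.

```python
# Sketch of the pcconfig.py that the huggingface-pynecone tool generates,
# reconstructed from the tweak list above (assumed, not part of this commit).
import pynecone as pc

config = pc.Config(
    app_name="clock",  # example app name from the README's sample output
    db_url="sqlite:///pynecone.db",  # note: not persisted across Space reloads
    api_url="https://wauplin-pynecone-counter.hf.space/pynecone-backend",
    env=pc.Env.PROD,
)
```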
default_app/default_app.py
DELETED
@@ -1,58 +0,0 @@
from pathlib import Path

import pynecone as pc


class State(pc.State):
    count: int = 0

    def increment(self):
        self.count += 1

    def decrement(self):
        self.count -= 1


def index():
    return pc.center(
        pc.vstack(
            pc.box(
                height="20px",
            ),
            pc.box(
                pc.heading("Pynecone x Spaces 🚀"),
            ),
            pc.hstack(
                pc.button(
                    "Decrement",
                    color_scheme="red",
                    border_radius="1em",
                    on_click=State.decrement,
                ),
                pc.heading(State.count, font_size="2em"),
                pc.button(
                    "Increment",
                    color_scheme="green",
                    border_radius="1em",
                    on_click=State.increment,
                ),
            ),
            pc.box(
                height="50px",
            ),
            pc.box(
                # Hacky way to append README content to the page
                pc.markdown(
                    (Path(__file__).parent.parent / "README.md")
                    .read_text()
                    .split("---")[-1]
                    .strip()
                )
            ),
        )
    )


app = pc.App(state=State)
app.add_page(index)
app.compile()
{default_app → news_pynecone}/__init__.py
RENAMED
File without changes
news_pynecone/home.py
ADDED
@@ -0,0 +1,402 @@
import os
import json
import requests
import feedparser
from typing import List
import pynecone as pc
from pygooglenews import GoogleNews
from newspaper import Article, Config
import nltk
import time
import openai
nltk.download('punkt')


class State(pc.State):
    text: str = "Type something..."
    titles: List[List] = []
    img_src: str = ""
    resource_href: List = []
    src_meta: List[List] = []
    summary: str = ""
    middle_summary_state: str = ""
    openai_key_show: bool = False
    _valid_state = ["info", "error", "success"]
    is_valid_code: str = _valid_state[0]
    _engine = GoogleNews(lang="en", country="US")
    _USER_AGENT = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36'
    _article_config = Config()
    _article_config.request_timeout = 10
    # _article_config.follow_meta_refresh=True
    _article_config.browser_user_agent = _USER_AGENT
    _prompt = '''
    Please help me gather data from various media sources above and analyze it across multiple articles to recognize similarities and differences.
    For instance, if several articles report on the launch of a new Tesla car, one source might state the retail price as $10, while another mentions it as $15.
    The similarities between the articles would be that they all cover the new car launch, while the differences would be the varying retail prices reported.
    The final output should include a summary paragraph followed by a list of similarities and differences,
    where the differences are presented in the format of source A reporting a price of $10, while source B reports a price of $15.
    response should be formed organized and neat in html layout.
    '''

    show_progress = False
    tmp_openai_key_text = ""
    OPENAI_API_KEY = ""  # os.environ["my_openai_key"]
    _ENDPOINT_URL = 'https://api.openai.com/v1/chat/completions'
    _OPENAI_HEADER: dict = {}

    def toggle_progress(self):
        self.show_progress = not self.show_progress

    def search(self):
        self.summary = ""
        self.middle_summary_state = ""
        self.titles = []
        self.src_meta = []
        if self.text != "":
            src_response = self._engine.search(self.text, when="3d")
            # title, src, date, url
            for t in src_response["entries"]:
                self.titles.append(t["title"].split(" - ")[0])
                self.src_meta.append(
                    [t["source"]["title"], t["published"], t["link"]])

    def set_text(self, text):
        self.text = f"intext:{text}"

    def set_openai_key_text(self, text):
        self.tmp_openai_key_text = text

    def submit_openai_key(self):
        # Define the API endpoint URL
        endpoint_url = "https://api.openai.com/v1/models"

        # Set the headers for the API request
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {self.tmp_openai_key_text}"
        }
        # Send the API request
        response = requests.get(endpoint_url, headers=headers)

        self.is_valid_code = self._valid_state[int(
            response.status_code == 200) + 1]

        if self.is_valid_code == "success":
            self.OPENAI_API_KEY = self.tmp_openai_key_text
            self._OPENAI_HEADER = {"Content-Type": "application/json",
                                   "Authorization": f"Bearer {self.OPENAI_API_KEY}"
                                   }

    def summarize(self, cur_title):
        cur_index = self.titles.index(cur_title)
        cur_src_meta = self.fetch_info(self.src_meta[cur_index][2])
        print(self.is_valid_code)
        self.middle_summary_state += "<strong>Start related content ........</strong>\n"
        if cur_src_meta is not None and self.is_valid_code == "success":
            self.img_src = cur_src_meta["image"]
            self.middle_summary_state += "<strong>{}</strong>\n".format(f"Processing Main Article: {cur_src_meta['title']}")
            related = self._engine.search(cur_src_meta['title'], when="7d")
            # print("----------- Finish finding related news")
            summary_list = [cur_src_meta["body"]]
            self.resource_href.append(cur_src_meta["url"])
            r_i = 1
            for r in related["entries"][1:6]:
                # print(f"+++++++++++++: {r['title']}")
                r_meta = self.fetch_info(r["link"])
                if r_meta is not None:
                    summary_list.append(r_meta["body"])
                    self.resource_href.append(r_meta["url"])
                    # print(f"-------------: {r['title']}")
                    self.middle_summary_state += "<strong>{}</strong>\n".format(f"Processing Related Article {r_i}: {r['title']}")
                    r_i += 1
                time.sleep(3)

            # self.summary = "\n".join(summary_list)
            self.call_openai(summary_list)
            # print("SUMMARY: ")
            # print(self.summary)
        elif cur_src_meta is not None and self.is_valid_code != "success":
            self.summary = "<strong>Please set up your openai api-key before running NewsGPT</strong>"
        else:
            self.summary = "<strong>Please set up your openai api-key before running NewsGPT</strong>"

    @pc.var
    def get_summary(self):
        if self.summary == "":
            return self.middle_summary_state
        else:
            return self.summary

    def fetch_info(self, rss_feed):
        news_feed = feedparser.parse(rss_feed)
        article = Article(news_feed["href"])  # , config=self._article_config)
        try:
            article.download()
            article.parse()
            article.nlp()
            return \
                {
                    "title": article.title,
                    "body": article.text,
                    "summary": article.summary,
                    "image": article.top_image,
                    "authors": article.authors,
                    "keywords": article.keywords,
                    "url": article.url
                }
        except:
            return None

    def call_openai(self, article_list):
        summary_list = []
        error_list = []
        # print("All Articles:")
        # print(". ".join(article_list))
        s_i = 1
        for idx, r in enumerate(article_list):
            messages = [
                {"role": "system", "content": "You are a very professional news artlce summization and analysis agent."},
                {"role": "user", "content": r + " " + "summarize the article above and preserve information on \
                    the following concepts: Personnel/Human Resources, Time and Place, Object/Thing. "},
            ]

            data = {
                "model": "gpt-3.5-turbo",
                "messages": messages,
                "temperature": 0.7
            }

            try:
                response = requests.post(
                    url=self._ENDPOINT_URL, headers=self._OPENAI_HEADER, data=json.dumps(data))
                response_json = response.json()
                # print(response_json)
                answer = response_json["choices"][0]["message"]["content"].strip()
                summary_list.append(answer)
                error_list.append("")
                self._middle_summary_state += "<strong>{}</strong>".format(f"Summarizing Article {s_i} ......")
            except Exception as e:
                error_list.append(e)

        if len(summary_list):
            summary_ = ", ".join([f"article {si} summary: {s}" for si, s in enumerate(
                summary_list)]) + self._prompt
            messages = [
                {"role": "system", "content": "You are a very professional news artlce summization and analysis agent."},
                {"role": "user", "content": summary_},
            ]

            data = {
                "model": "gpt-3.5-turbo",
                "messages": messages,
                "temperature": 0.7
            }
            # print(messages)
            self._middle_summary_state += "<strong>{}</strong>".format(f"Analyzing All Articles .....")
            try:
                response = requests.post(
                    url=self._ENDPOINT_URL, headers=self._OPENAI_HEADER, data=json.dumps(data))
                response_json = response.json()
                # print(response_json)
                self.summary = \
                    f"""
                    Reference article number: {len(summary_list)}
                    {response_json["choices"][0]["message"]["content"].strip()}
                    """
                # print("Finish Calling OPENAI")
            except Exception as e:
                self.summary = e
        else:
            # print(error_list)
            self.summary = "Something went wrong"

    def clear(self):
        self.summary = ""
        self.titles = []
        self.src_meta = []

    def openai_setup_window(self):
        self.openai_key_show = not (self.openai_key_show)

    @pc.var
    def check_openai_setup(self):
        return self.OPENAI_API_KEY != ""


article_box_style = {
    "bg": "rgba(255,255,255, 0.5)",
    "box_shadow": "3px -3px 7px #cccecf, -3px 3px 7px #ffffff",
    "border_radius": "10px",
    "height": "80px",
    "width": "100%",
    "margin_top": "0.75em",
    "align_items": "left"
}

title_style = {
    "padding": "1em",
    "font_weight": "bold",
    "font_size": "0.9em",
}

def article_card(data):
    return \
        pc.container(
            pc.box(
                pc.text(
                    data,
                    style=title_style
                ),
                on_click=State.summarize(data),
                style=article_box_style,
                _hover={"cursor": "pointer"}
            ),
        )


def home():
    """The home page."""
    # return pc.center(
    return \
        pc.vstack(
            pc.box(
                pc.flex(
                    pc.link(
                        pc.text(
                            "NewsGPT",
                            font_size="1.5em",
                            font_weight=600,
                            background_image="linear-gradient(271.68deg, #EE756A 0.75%, #756AEE 88.52%)",
                            background_clip="text",
                            margin_left="20px"
                        ),
                        href="/",
                        on_click=State.clear
                    ),
                    pc.spacer(),
                    pc.hstack(
                        pc.input(
                            placeholder="New Topic?",
                            on_blur=State.set_text,
                            border_radius="10px",
                            width="450px"
                        ),
                        pc.button(
                            "Search",
                            bg="lightgreen",
                            color="black",
                            is_active=True,
                            on_click=State.search,
                            size="sm",
                            border_radius="1em",
                            variant="outline",
                        ),
                    ),
                    pc.spacer(),
                    pc.button(
                        "OpenAI-KEY",
                        bg="lightgreen",
                        color="black",
                        is_active=True,
                        on_click=State.openai_setup_window,
                        size="sm",
                        border_radius="1em",
                        variant="outline",
                        margin_right="20px"
                    ),
                    pc.alert_dialog(
                        pc.alert_dialog_overlay(
                            pc.alert_dialog_content(
                                pc.alert_dialog_header(
                                    "OpenAI API-Key Setup"),
                                pc.alert_dialog_body(
                                    pc.container(
                                        pc.input(
                                            placeholder="OpenAI Key",
                                            on_blur=State.set_openai_key_text,
                                            border_radius="10px",
                                            width="100%",
                                            type_="password",
                                        ),
                                    ),
                                ),
                                pc.alert_dialog_footer(
                                    pc.vstack(
                                        pc.alert(
                                            pc.alert_icon(),
                                            pc.alert_title(
                                                "set api-key [" +
                                                State.is_valid_code + "]"
                                            ),
                                            status=State.is_valid_code
                                        ),
                                        pc.hstack(
                                            pc.button(
                                                "Submit",
                                                on_click=State.submit_openai_key,
                                            ),
                                            pc.button(
                                                "Close",
                                                on_click=State.openai_setup_window,
                                            ),
                                        ),
                                    ),
                                ),
                                align_items="center"
                            ),
                        ),
                        is_open=State.openai_key_show,
                    ),
                    align_items="center"
                ),
                bg="rgba(255,255,255, 0.7)",
                backdrop_filter="blur(10px)",
                padding_y=["0.8em", "0.8em", "0.5em"],
                border_bottom="0.08em solid rgba(32, 32, 32, .3)",
                position="sticky",
                width="80%",
                top="20px",
                z_index="99",
                justify="center",
                border_radius="20px",
            ),
            pc.grid(
                pc.grid_item(pc.spacer(), row_span=5, col_span=1),
                pc.grid_item(
                    pc.vstack(
                        pc.foreach(
                            State.titles,
                            article_card
                        ),
                        overflow="auto",
                        height="875px"
                    ),
                    row_span=5,
                    col_span=3,
                    # bg="rgba(255,255,255, 0.9)",
                    margin_top="3em",
                    border_radius="20px",
                    box_shadow="7px -7px 14px #cccecf, -7px 7px 14px #ffffff"
                ),
                # pc.grid_item(pc.spacer(), row_span=5, col_span=1),
                pc.grid_item(
                    pc.box(
                        pc.html(State.get_summary, padding="10px"),
                        height="875px",
                        width="100%",
                        overflow="auto",
                    ),
                    row_span=5,
                    col_span=3,
                    margin_top="3em",
                    border_radius="20px",
                    margin_left="50px",
                    box_shadow="7px -7px 14px #cccecf, -7px 7px 14px #ffffff"
                ),
                template_rows="repeat(5, 1fr)",
                template_columns="repeat(8, 1fr)",
                width="100%",
            ),
            pc.spacer(),
            background="radial-gradient(circle at 22% 11%,rgba(62, 180, 137,.20),hsla(0,0%,100%,0) 19%),radial-gradient(circle at 82% 25%,rgba(33,150,243,.18),hsla(0,0%,100%,0) 35%),radial-gradient(circle at 25% 61%,rgba(250, 128, 114, .28),hsla(0,0%,100%,0) 55%)",
        )
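To make the control flow of `home.py` easier to follow, here is a minimal standalone sketch of the data-gathering path that `State.search` and `State.fetch_info` implement with pygooglenews, feedparser, and newspaper3k. The query string and result limit are illustrative, and the OpenAI summarization step is omitted.

```python
# Minimal sketch of home.py's search/fetch pipeline (illustrative query and limit).
import feedparser
from newspaper import Article
from pygooglenews import GoogleNews

engine = GoogleNews(lang="en", country="US")
results = engine.search("intext:tesla", when="3d")  # same query shape as State.search

for entry in results["entries"][:3]:
    # home.py resolves each Google News entry link with feedparser and passes the
    # resolved href to newspaper's Article before downloading and parsing it.
    resolved = feedparser.parse(entry["link"])
    article = Article(resolved["href"])
    try:
        article.download()
        article.parse()
        print(entry["source"]["title"], "|", article.title)
        print(article.text[:200], "...")
    except Exception:
        # fetch_info returns None on any failure; here we simply skip the entry
        continue
```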
news_pynecone/news_pynecone.py
ADDED
@@ -0,0 +1,38 @@
from pcconfig import config
import pynecone as pc

from .home import home, State

# class State(pc.State):
#     """The app state."""
#     pass


# def index() -> pc.Component:
#     return pc.center(
#         pc.vstack(
#             pc.heading("Welcome to Pynecone!", font_size="2em"),
#             pc.box("Get started by editing ", pc.code(filename, font_size="1em")),
#             pc.link(
#                 "Check out our docs!",
#                 href=docs_url,
#                 border="0.1em solid",
#                 padding="0.5em",
#                 border_radius="0.5em",
#                 _hover={
#                     "color": "rgb(107,99,246)",
#                 },
#             ),
#             spacing="1.5em",
#             font_size="2em",
#         ),
#         padding_top="10%",
#     )


# Add state and page to the app.
app = pc.App(state=State, stylesheets=[
    "https://cdnjs.cloudflare.com/ajax/libs/animate.css/4.1.1/animate.min.css",
],)
app.add_page(home, route="/")
app.compile()
pcconfig_docker.py
CHANGED
@@ -1,9 +1,9 @@
 import pynecone as pc
 
 config = pc.Config(
-    app_name="
+    app_name="news_pynecone",
     db_url="sqlite:///pynecone.db",
-    api_url="https://
+    api_url="https://timho102003-news_pynecone.hf.space/pynecone-backend",
     env=pc.Env.PROD,
 )