HCZhang committed
Commit 0dc1232
1 Parent(s): 8df9e15

Update README.md

Files changed (1): README.md (+88 -19)
README.md CHANGED
@@ -48,7 +48,7 @@ generated by GPT-4.
 
The two versions are designed for different application scenarios.
Jellyfish-13B is suitable for integration into larger data management systems due to its simple and clear responses that can be easily transformed into code.
- On the other hand, Jellyfish-13B-Reasoning is more user-oriented, with responses that offer deeper insights into the data.
 
**Jellyfish paper will be coming soon!**
 
@@ -82,7 +82,9 @@ We used LoRA to speed up the training process, targeting the q_proj and v_proj m
## Uses
 
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
- Here are the prompts we used for both fine-tuning the model and for inference. Feel free to explore different prompts on your own to achieve the best generation quality.
 
### JellyFish-13B
#### For Entity Matching
@@ -90,9 +92,9 @@ Here are the prompts we used for both fine-tuning the model and for inference. F
You are tasked with determining whether two records listed below are the same based on the information provided.
Carefully compare the {attribute 1}, {attribute 2}... for each record before making your decision.
Note: Missing values (N/A or \"nan\") should not be used as a basis for your decision.
- Record A: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}...]
- Record B: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}...]
- Are record A and record B the same entity? Choose your answer from: [Yes, No]
```
#### For Data Imputation
```
@@ -104,34 +106,34 @@ Based on the provided record, what would you infer is the value for the missing
Answer only the value of {attribute X}.
```
#### For Error Detection
- _There are two forms of the error detection task.
- In the first form, a complete record row is provided, and the task is to determine if a specific value is erroneous.
- In the second form, only the value of a specific attribute is given, and the decision about its correctness is based solely on the attribute's name and value.
The subsequent prompt examples pertain to these two forms, respectively._
```
Your task is to determine if there is an error in the value of a specific attribute within the whole record provided.
The attributes may include {attribute 1}, {attribute 2}, ...
Errors may include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense given the context of the whole record.
- Record [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}...]
Attribute for Verification: [{attribute X}: {attribute X value}]
- Question: Is there an error in the value of {attribute X}? Choose your answer from: [Yes, No]
```
```
Your task is to determine if there is an error in the value of a specific attribute.
- The attributes may belong to a healthcare-related record and could be one of the following: {attribute 1}, {attribute 2}, ...
Errors can include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense for that attribute.
Note: Missing values (N/A or \"nan\") are not considered errors.
Attribute for Verification: [{attribute X}: {attribute X value}]
- Question: Is there an error in the value of {attribute X}? Choose your answer from: [Yes, No]
```
#### For Schema Matching
```
Your task is to determine if the two attributes (columns) are semantically equivalent in the context of merging two tables.
Each attribute will be provided by its name and a brief description.
Your goal is to assess if they refer to the same information based on these names and descriptions provided.
- Attribute A is [name: {the value of name}, description: {the value of description}].
- Attribute B is [name: {the value of name}, description: {the value of description}].
- Are Attribute A and Attribute B semantically equivalent? Choose your answer from: [Yes, No]
```
 
### For JellyFish-13B-reasoning
@@ -140,17 +142,84 @@ Are Attribute A and Attribute B semantically equivalent? Choose your answer from
You are tasked with determining whether two products listed below are the same based on the information provided.
Carefully examine all the attributes before making your decision.
Note: Missing values (N/A or \"nan\") should not be used as a basis for your decision.
- Record A: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}...]
- Record B: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}...]
Are record A and record B the same entity?
After your reasoning, finish your response in a separate line with and ONLY with your final answer. Choose your final answer from [Yes, No].
```
#### For Data Imputation
-
#### For Error Detection
-
#### For Schema Matching
 
<!--
## Bias, Risks, and Limitations
 
 
The two versions are designed for different application scenarios.
Jellyfish-13B is suitable for integration into larger data management systems due to its simple and clear responses that can be easily transformed into code.
+ On the other hand, Jellyfish-13B-Reasoning is more user-oriented, with responses that provide in-depth data insights without requiring advanced coding skills or an intricate grasp of statistics.
 
**Jellyfish paper will be coming soon!**
 
## Uses
 
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+ We provide the prompts used for both the model's fine-tuning and inference.
+ You can structure your data according to these prompts.
+ However, we encourage experimenting with different prompts to potentially achieve optimal generation quality.
 
### JellyFish-13B
#### For Entity Matching
 
You are tasked with determining whether two records listed below are the same based on the information provided.
Carefully compare the {attribute 1}, {attribute 2}... for each record before making your decision.
Note: Missing values (N/A or \"nan\") should not be used as a basis for your decision.
+ Record A: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
+ Record B: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
+ Are record A and record B the same entity? Choose your answer from: [Yes, No].
```
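Filling this template programmatically is straightforward. Below is a minimal, hypothetical sketch (not part of the Jellyfish release; `serialize_record` and `build_em_prompt` are names chosen here for illustration) of how two record dicts could be rendered into the entity-matching prompt above.

```python
def serialize_record(record: dict) -> str:
    """Render a record dict as "[attr: value, attr: value, ...]",
    the bracketed format the prompt expects."""
    return "[" + ", ".join(f"{k}: {v}" for k, v in record.items()) + "]"

def build_em_prompt(record_a: dict, record_b: dict) -> str:
    """Assemble the entity-matching prompt for two records,
    listing the attribute names of record A in the instruction."""
    attrs = ", ".join(record_a.keys())
    return "\n".join([
        "You are tasked with determining whether two records listed below "
        "are the same based on the information provided.",
        f"Carefully compare the {attrs} for each record before making your decision.",
        'Note: Missing values (N/A or "nan") should not be used as a basis for your decision.',
        f"Record A: {serialize_record(record_a)}",
        f"Record B: {serialize_record(record_b)}",
        "Are record A and record B the same entity? Choose your answer from: [Yes, No].",
    ])

prompt = build_em_prompt(
    {"title": "ipod nano 8gb", "price": "149.0"},
    {"title": "apple ipod nano (8 gb)", "price": "nan"},
)
```

The same serialization can be reused for the other record-based prompts, since they all share the bracketed attribute-value layout.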
#### For Data Imputation
```
 
Answer only the value of {attribute X}.
```
#### For Error Detection
+ _There are two forms of the error detection task.
+ In the first form, a complete record row is provided, and the task is to determine if a specific value is erroneous.
+ In the second form, only the value of a specific attribute is given, and the decision about its correctness is based solely on the attribute's name and value.
The subsequent prompt examples pertain to these two forms, respectively._
```
Your task is to determine if there is an error in the value of a specific attribute within the whole record provided.
The attributes may include {attribute 1}, {attribute 2}, ...
Errors may include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense given the context of the whole record.
+ Record [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
Attribute for Verification: [{attribute X}: {attribute X value}]
+ Question: Is there an error in the value of {attribute X}? Choose your answer from: [Yes, No].
```
```
Your task is to determine if there is an error in the value of a specific attribute.
+ The attributes may belong to a {keyword} record and could be one of the following: {attribute 1}, {attribute 2}, ...
Errors can include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense for that attribute.
Note: Missing values (N/A or \"nan\") are not considered errors.
Attribute for Verification: [{attribute X}: {attribute X value}]
+ Question: Is there an error in the value of {attribute X}? Choose your answer from: [Yes, No].
```
#### For Schema Matching
```
Your task is to determine if the two attributes (columns) are semantically equivalent in the context of merging two tables.
Each attribute will be provided by its name and a brief description.
Your goal is to assess if they refer to the same information based on these names and descriptions provided.
+ Attribute A is [name: {value of name}, description: {value of description}].
+ Attribute B is [name: {value of name}, description: {value of description}].
+ Are Attribute A and Attribute B semantically equivalent? Choose your answer from: [Yes, No].
```
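The `{keyword}` slot introduced in the second error-detection form appears to be a domain hint for the record (the previous README hard-coded "healthcare-related" here). A hypothetical filler for that form, with helper names of our own choosing:

```python
def build_ed_prompt(keyword: str, attribute: str, value: str) -> str:
    """Fill the second error-detection form: only the attribute's
    name and value are given, plus a domain keyword for context."""
    return "\n".join([
        "Your task is to determine if there is an error in the value of a specific attribute.",
        f"The attributes may belong to a {keyword} record.",
        "Errors can include, but are not limited to, spelling errors, inconsistencies, "
        "or values that don't make sense for that attribute.",
        'Note: Missing values (N/A or "nan") are not considered errors.',
        f"Attribute for Verification: [{attribute}: {value}]",
        f"Question: Is there an error in the value of {attribute}? Choose your answer from: [Yes, No].",
    ])

# Hypothetical attribute/value pair, for illustration only.
p = build_ed_prompt("healthcare-related", "ProviderNumber", "1Z523")
```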
 
### For JellyFish-13B-reasoning
 
You are tasked with determining whether two products listed below are the same based on the information provided.
Carefully examine all the attributes before making your decision.
Note: Missing values (N/A or \"nan\") should not be used as a basis for your decision.
+ Record A: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
+ Record B: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
Are record A and record B the same entity?
After your reasoning, finish your response in a separate line with and ONLY with your final answer. Choose your final answer from [Yes, No].
```
#### For Data Imputation
+ ```
+ You are presented with a {keyword} record that is missing a specific attribute {attribute X}.
+ Your task is to deduce or infer the value of {attribute X} using the available information in the record.
+ You may be provided with fields like {attribute 1}, {attribute 2}, ... to help you in the inference.
+ Record: [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
+ Based on the provided record, what would you infer is the value for the missing attribute {attribute X}?
+ After your reasoning, finish your response in a separate line with and ONLY with your final answer.
+ Your final answer should only consist of the value of {attribute X}.
+ ```
#### For Error Detection
+ ```
+ Your task is to determine if there is an error in the value of a specific attribute within the whole record provided.
+ Errors may include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense given the context of the whole record.
+ Record [{attribute 1}: {attribute 1 value}, {attribute 2}: {attribute 2 value}, ...]
+ Attribute for Verification: [{attribute X}: {attribute X value}]
+ Question: Is there an error in the value of {attribute X}?
+ After your reasoning, finish your response in a separate line with and ONLY with your final answer. Choose your final answer from [Yes, No].
+ ```
+ ```
+ Your task is to determine if there is an error in the value of a specific attribute.
+ The attributes may belong to a {keyword} record.
+ Errors can include, but are not limited to, spelling errors, inconsistencies, or values that don't make sense for that attribute.
+ Note: Missing values (N/A or \"nan\") are not considered errors.
+ Attribute for Verification: [{attribute X}: {attribute X value}]
+ Question: Is there an error in the value of {attribute X}?
+ After your reasoning, finish your response in a separate line with and ONLY with your final answer. Choose your final answer from [Yes, No].
+ ```
#### For Schema Matching
+ ```
+ Your task is to determine if the two attributes (columns) are semantically equivalent in the context of merging two tables.
+ Each attribute will be provided by its name and a brief description.
+ Your goal is to assess if they refer to the same information based on these names and descriptions provided.
+ Attribute A is [name: {value of name}, description: {value of description}].
+ Attribute B is [name: {value of name}, description: {value of description}].
+ Are Attribute A and Attribute B semantically equivalent?
+ After your reasoning, finish your response in a separate line with and ONLY with your final answer. Choose your final answer from [Yes, No].
+ ```
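Because every reasoning prompt asks the model to end with ONLY the final answer on a separate line, the label (Yes/No or an imputed value) can be recovered by taking the last non-empty line of the response. A minimal sketch, using a helper of our own rather than anything from the Jellyfish codebase:

```python
def extract_final_answer(response: str) -> str:
    """Return the last non-empty line of a reasoning response, which the
    prompts above require to contain only the final answer."""
    lines = [line.strip() for line in response.strip().splitlines()]
    non_empty = [line for line in lines if line]
    return non_empty[-1] if non_empty else ""

sample = "The titles are identical...\nGiven these similarities, they match.\nYes"
answer = extract_final_answer(sample)
```

The preceding lines of the response remain available as the human-readable explanation.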
 
+ ## Sample Responses from Jellyfish-13B-Reasoning
+ We've provided a few sample responses from Jellyfish-13B-Reasoning to demonstrate its performance.
+ 
+ _For easier readability, we display the raw data record instead of the entire prompt._
+ 
+ 1. ```
+ Product A: [name: \"adobe creative suite cs3 web standard [ mac ]\", manufacturer: \"adobe\", price: \"999.0\"]
+ Product B: [name: \"adobe creative suite 3 ( cs3 ) web standard 3 windows upsell from dreamweaver or flash\", price: \"799.0\"]
+ 
+ Product A and Product B are not the same. Although both products are Adobe Creative Suite CS3 Web Standard, they are for different operating systems.
+ Product A is for Mac and Product B is for Windows. This difference in operating systems indicates that the products are not the same, as they are designed for different hardware and software environments.
+ Additionally, Product B mentions that it is an upsell from Dreamweaver or Flash, which further differentiates it from Product A.
+ No
+ ```
+ 2. ```
+ Article A: [title: \"l/mrp : a buffer management strategy for interactive continuous data flows in a multimedia dbms\", authors: \"f moser , a kraiss , w klas\", venue: \"vldb\", year: \"1995\"]
+ Article B: [title: \"l/mrp : a buffer management strategy for interactive continuous data flows in a multimedia dbms\", authors: \"f moser , w klas\", venue: \"proceedings of the international conference on very large & hellip ; ,\", year: \"1995.0\"]
+ 
+ The titles of both Article A and Article B are identical, which is a strong indicator that they are the same article.
+ The authors of both articles are also the same, with the exception that Article B does not include the author \"a kraiss\". However, this could be due to a mistake in data entry or a missing author in Article B.
+ The venue for both articles is also similar, with Article A listed as \"vldb\" and Article B listed as \"proceedings of the international conference on very large & hellip ; ,\" which is likely the same conference.
+ The year of publication is also the same for both articles, with Article A listed as \"1995\" and Article B listed as \"1995.0\".
+ Given these similarities, it is reasonable to conclude that Article A and Article B are the same.
+ Yes
+ ```
+ 3. ```
+ Record: [name: "le montrachet bistro", addr: "3000 paradise rd.", phone: "702-732-5651", type: "french bistro"]
+ Missing attribute: "City"
+ 
+ The restaurant record provides the name, address, phone number, and type of restaurant. The name and type do not provide any specific information about the city. However, the address and phone number can be used to infer the city.
+ The address, "3000 paradise rd.", could potentially be in many different cities. However, the phone number, "702-732-5651", provides a clue. The area code, 702, is used in the state of Nevada, specifically in the Las Vegas metropolitan area.
+ Therefore, based on the area code of the phone number, it can be inferred that the restaurant is located in Las Vegas.
+ Las Vegas
+ ```
+ 
 
<!--
## Bias, Risks, and Limitations