---
# Thank you for contributing!
# In filling out this yaml file, please follow the criteria as described here:
# https://osai-index.eu/contribute
# You're free to build on this work and reuse the data. It is licensed under CC-BY 4.0, with the
# stipulation that attribution should come in the form of a link to https://osai-index.eu/
# and a citation to the peer-reviewed paper in which the dataset & criteria were published:
# Liesenfeld, A. and Dingemanse, M., 2024. Rethinking open source generative AI: open-washing and the EU AI Act. In Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (pp. 1774-1787).
# Organization tags:
# - National origin: China
# - Contributor type: Academic (Research community)
system:
  name: OpenCodeInterpreter
  link: https://huggingface.co/m-a-p/OpenCodeInterpreter-CL-70B
  type: code
  performanceclass: full
  basemodelname: Llama-2-70B
  endmodelname: OpenCodeInterpreter-CL-70B
  endmodellicense: Apache-2.0
  releasedate: 2024-02
  notes: Coder model integrating execution and iterative refinement functionalities.
org:
  name: Multimodal Art Projection
  link: https://m-a-p.ai/
  notes: Open-source AI research community.
# availability:
datasources_basemodel:
  class: closed
  link: https://ai.meta.com/research/publications/llama-2-open-foundation-and-fine-tuned-chat-models/
  notes: Training data is nowhere disclosed or documented, and is described only in the vaguest terms in a corporate preprint released by Meta.
datasources_endmodel:
  class: closed
  link: https://huggingface.co/datasets/m-a-p/Code-Feedback
  notes: Dataset for the end model is published on HuggingFace; a proprietary dataset was used for the intermediate CodeLlama model.
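# Illustrative only (not part of the assessment record): the Code-Feedback dataset
# linked above should be loadable with the Hugging Face `datasets` library. A minimal
# Python sketch, assuming the standard load_dataset API:
#
#   from datasets import load_dataset
#
#   # Pull the end-model fine-tuning data from the Hub
#   ds = load_dataset("m-a-p/Code-Feedback")
#   print(ds)  # inspect splits and features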
weights_basemodel:
  class: partial
  link: https://ai.meta.com/resources/models-and-libraries/llama-downloads/
  notes: Download only available after requesting access; requires signing a consent form.
weights_endmodel:
  class: open
  link: https://huggingface.co/m-a-p/OpenCodeInterpreter-CL-70B
  notes: Model weights published on HuggingFace.
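# Illustrative only (not taken from the model card): since the end-model weights are
# openly published, they should load via the standard transformers causal-LM API. A
# minimal Python sketch; device_map="auto" requires the accelerate package:
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#
#   model_id = "m-a-p/OpenCodeInterpreter-CL-70B"
#   tokenizer = AutoTokenizer.from_pretrained(model_id)
#   # A 70B model needs substantial GPU memory; device_map="auto" shards the
#   # weights across whatever devices are available.
#   model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")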
trainingcode:
  class: closed
  link: https://github.com/OpenCodeInterpreter/OpenCodeInterpreter/
  notes: Repo exists, but does not contain training code.
# documentation:
code:
  class: closed
  link:
  notes: No source code released, so no code documentation.
hardware_architecture:
  class: closed
  link:
  notes: No hardware architecture description provided.
preprint:
  class: open
  link: https://arxiv.org/pdf/2402.14658
  notes: Preprint made available through arXiv.
paper:
  class: open
  link: https://aclanthology.org/2024.findings-acl.762/
  notes: Paper published in Findings of ACL 2024.
modelcard:
  class: closed
  link: https://huggingface.co/m-a-p/OpenCodeInterpreter-CL-70B
  notes: Model card provides minimal information, only describing benchmarking and usage.
datasheet:
  class: closed
  link: https://huggingface.co/datasets/m-a-p/Code-Feedback
  notes: Datasheet for base model unavailable. Datasheet for end model contains limited information.
# access:
package:
  class: closed
  link:
  notes: No package found.
api:
  class: open
  link: https://huggingface.co/spaces/m-a-p/OpenCodeInterpreter_demo
  notes: HuggingFace Space hosting the model is available.
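# Illustrative only: the demo Space can presumably be queried programmatically with
# the gradio_client library. Its endpoint names and inputs are not documented here,
# so the sketch below only discovers them rather than assuming a signature:
#
#   from gradio_client import Client
#
#   client = Client("m-a-p/OpenCodeInterpreter_demo")
#   client.view_api()  # list the Space's endpoints before calling client.predict(...)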
metaprompt:
  class: closed
  link:
  notes:
licenses:
  class: open
  link: https://huggingface.co/m-a-p/OpenCodeInterpreter-CL-70B
  notes: Apache-2.0, an OSI-approved license.