CoreDump: CodeMirror weds HuggingFace API
Dump 062220221031
Gecko: With Dr. Bheemaiah speed-testing GitHub Copilot, I thought of marketing the concept to the rest of the community, including the rebel factions of the maker-coder community. In true Unix philosophy, we pair with a code generator, and this LinkedIn post caught my attention.
In this episode, we integrate any number of APIs from Hugging Face to pair-program with CodeMirror, so that one can practice exactly two approaches: better CARE (computer-aided requirements engineering) with PlantUML, and writing user stories that the copilot automatically translates to code …
I thought of an Amazon coding test with a CodeMirror window, drawing from a GitHub repository of questions …
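A minimal sketch of that idea: fetch a question file from a GitHub repo and drop it into a CodeMirror editor. The repo owner, name, and file path below are placeholders, not a real questions repository, and the editor API assumed is CodeMirror 5's setValue.

```javascript
// Build a raw-content URL for a file hosted on GitHub.
// GitHub serves raw file contents from raw.githubusercontent.com.
function rawGithubUrl(owner, repo, branch, path) {
  return `https://raw.githubusercontent.com/${owner}/${repo}/${branch}/${path}`;
}

// Fetch a question and load it into a CodeMirror 5 editor instance.
// "example-org/questions-repo" is a hypothetical repository name.
async function loadQuestion(editor, path) {
  const url = rawGithubUrl("example-org", "questions-repo", "main", path);
  const res = await fetch(url);
  const text = await res.text();
  editor.setValue(text); // replace the editor contents with the question
}
```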
Waiting for the Codex Key ….
CodeGen is another model that can be used with the Hugging Face API; I fished this fish with my automatic fisherman.
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen-16B-mono")
model = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-16B-mono")
text = "def hello_world():"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=128)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
Lisa: Susie thinks this is Big Fish, since the logo above is too big for her!
Gecko: We need to call this API and integrate CodeGen with CodeMirror.net.
We write a function in JS to call the API.
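One way that JS function could look, as a sketch: POST the editor's contents to the Hugging Face hosted Inference API endpoint for the CodeGen model and write the completion back. The endpoint shape (api-inference.huggingface.co, Bearer token, {"inputs": ...} body) follows Hugging Face's Inference API; the response handling assumes the usual [{generated_text: ...}] payload, and the token is a placeholder you supply.

```javascript
const HF_MODEL = "Salesforce/codegen-16B-mono";

// Assemble the request for the Hugging Face Inference API.
// apiToken is your personal Hugging Face access token.
function buildHfRequest(prompt, apiToken) {
  return {
    url: `https://api-inference.huggingface.co/models/${HF_MODEL}`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs: prompt }),
    },
  };
}

// Send the CodeMirror editor's contents as the prompt and show the completion.
async function complete(editor, apiToken) {
  const { url, options } = buildHfRequest(editor.getValue(), apiToken);
  const res = await fetch(url, options);
  const data = await res.json(); // typically [{ generated_text: "..." }]
  if (Array.isArray(data) && data[0] && data[0].generated_text) {
    editor.setValue(data[0].generated_text);
  }
}
```

Keeping the request-building separate from the fetch makes the call easy to test and to swap for a self-hosted endpoint later.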
In the next episode, we will have CodeGen running …