No, it provides an API as an interface; it is not consuming one.
I don’t think you understand what this is. It is a local model; it runs locally. It provides an API which you can then use in the same manner as you would the ChatGPT API. I’m not super familiar with GPT4All since llama.cpp/kobold.cpp are pretty much the standard in local inference, but for example llama-cpp-python provides an OpenAI-compatible API.
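To make that concrete, here's a minimal sketch of what "OpenAI-compatible" means in practice: the request shape for `/v1/chat/completions` is the same whether you're talking to OpenAI or a local server, so only the base URL changes. (The port and model name below are assumptions for illustration; llama-cpp-python's server defaults to port 8000.)

```python
import json

# Assumed local endpoint -- llama-cpp-python's server default is port 8000.
# Pointing at https://api.openai.com/v1 instead would hit the real ChatGPT API.
BASE_URL = "http://localhost:8000/v1"

def chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build a /v1/chat/completions payload in the OpenAI wire format.

    The exact same dict works against either backend; the model name is
    whatever the local server is configured to serve (hypothetical here).
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = chat_request("Hello!")
print(json.dumps(payload))
```

In other words, any client library or tool written against the ChatGPT API can usually be pointed at the local server just by overriding its base URL.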
Perhaps not a lot, but it will save someone new from having to learn an entire set of obtuse modal keybinds.
In Material Design those would be cards, like here.
Give me fish or give me vanilla bash.
I will absolutely take that bet. Given how unpopular the decision is, combined with it being even tangentially related to the gaming community, I would be astonished if they didn’t receive death threats.