yuvraj17
Llama3-8B-abliterated-Spectrum-slerp
Llama3-8B-abliterated-Spectrum-slerp is a merge of the following models, made with LazyMergekit:

- yuvraj17/Llama-3-8B-spectrum-25
- mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated

Model merging, also known as model fusion, is an effective technique that combines the parameters of multiple separate models with different capabilities into a single, more general model, without needing access to the original training data or expensive computation. mergekit supports a number of merging methods, including linear interpolation, SLERP, TIES, DARE, and passthrough. For a deeper dive into the different merging techniques, see Merge Large Language Models with mergekit.

Spherical Linear Interpolation (SLERP) is a method for smoothly interpolating between two vectors. It maintains a constant rate of change and preserves the geometric properties of the spherical space in which the vectors lie. SLERP is currently the most popular merging method, preferred over traditional methods because the interpolation occurs on the surface of a sphere rather than along a straight line, and it has achieved improved performance across very diverse tasks.

> SLERP is limited to combining only two models at a time, although it is possible to combine multiple models hierarchically, as shown in Mistral-7B-Merge-14-v0.1.

Special thanks & references:

- Maxime Labonne for the easy-to-use Colab notebook Merging LLMs with MergeKit and the accompanying blog post
- The authors of mergekit
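The exact merge configuration is not reproduced above; as an illustration, a typical mergekit SLERP config for this pair of models (layer ranges, `t` schedule, and dtype here are assumed, not taken from the actual merge) might look like:

```yaml
slices:
  - sources:
      - model: yuvraj17/Llama-3-8B-spectrum-25
        layer_range: [0, 32]
      - model: mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated
        layer_range: [0, 32]
merge_method: slerp
base_model: yuvraj17/Llama-3-8B-spectrum-25
parameters:
  t:
    # interpolation factor per layer group: 0 keeps the base model,
    # 1 keeps the other model; lists are interpolated across layers
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```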
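The SLERP formula described above can be sketched in a few lines of NumPy; this is a minimal illustration of the interpolation itself, not mergekit's actual implementation (which applies it tensor-by-tensor across model weights):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at factor t in [0, 1]."""
    # Angle between the (normalized) vectors
    dot = np.clip(
        np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1)), -1.0, 1.0
    )
    theta = np.arccos(dot)
    # Nearly parallel vectors: fall back to plain linear interpolation
    if theta < eps:
        return (1 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    # Interpolate along the arc of the sphere at a constant angular rate
    return (np.sin((1 - t) * theta) / sin_theta) * v0 + (
        np.sin(t * theta) / sin_theta
    ) * v1
```

For example, interpolating halfway between two orthogonal unit vectors stays on the unit sphere, whereas plain linear interpolation would shorten the result.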