References for: assertion
Full identifier: https://w3id.org/np/RAPwHYQQtXh6p3DQQ066TmpKOBMIWkerAYv-chCViAqC0#assertion
| Nanopublication | Part | Subject | Predicate | Object | Published By | Published On |
|---|---|---|---|---|---|---|
| | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasProvenance](http://www.nanopub.org/nschema#hasProvenance) (links a nanopublication to its provenance) | provenance | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasProvenance](http://www.nanopub.org/nschema#hasProvenance) (links a nanopublication to its provenance) | provenance | Sensenets | 2024-09-03T21:16:16.131Z |
| Merging models trained for long with WIDEN (full text below) | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
Full text for the entry above (a sketch of the merging recipe in code follows the quote):

> Merging models trained for long with WIDEN
>
> When models are trained on a lot of data they diverge further from the baseline (e.g., in continual pretraining for additional languages), and current merging methods underperform in this setting.
> https://alphaxiv.org/pdf/2408.03092
> @AlibabaGroup https://twitter.com/LChoshen/status/1823002789217493392/photo/1
>
> How do you do that? Let's assume we are updating a single weight matrix using a few models:
>
> - Pick a pretrained model and treat the rest of the models as diffs from it (task vectors).
> - Normalize each row of each diff, separating it into a normalization factor (magnitude) and a direction (the normalized row).
> - Weight every row by how much it changed (more change = higher weight) and average them all together, plus a trick that sometimes keeps the original weight, so the weights might not sum to 1.
>
> You can see how this follows recent findings about direction and size (e.g. https://x.com/prateeky2806/status/1727589818618523783).
>
> While the results for "just" merging don't change that much, merging with a continually trained model (Sailor) that added many languages looks quite good! https://twitter.com/LChoshen/status/1823002796259791276/photo/1
>
> Criticism (@askalphaxiv didn't upload the comment):
> There is vast overclaiming in calling Sailor a different pretrained model.
> It is quite complex, so it is hard to know whether it will generalize, and they only show a specific model.

Published by Sensenets on 2024-09-03T21:16:16.131Z.
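To make the row-wise recipe above concrete, here is a minimal numpy sketch of that idea: diff each model from the pretrained base, split every row into a magnitude and a direction, weight rows by how much they changed, and fall back to the original row when it barely moved. The function name `merge_rows_widen_style`, the softmax weighting, and the `keep_threshold` fallback are illustrative assumptions for this sketch, not the exact WIDEN formulas from the paper.

```python
import numpy as np

def merge_rows_widen_style(base, finetuned, eps=1e-8, keep_threshold=1e-3):
    """Simplified row-wise merge in the spirit of the WIDEN description above.

    base:      pretrained weight matrix, shape (out_dim, in_dim)
    finetuned: list of weight matrices to merge, each the same shape as `base`
    """
    # 1. Treat every model as a diff (task vector) from the pretrained base.
    diffs = [w - base for w in finetuned]

    # 2. Row-wise disentanglement: split each diff row into magnitude and direction.
    mags = [np.linalg.norm(d, axis=1, keepdims=True) for d in diffs]   # (out_dim, 1) each
    dirs = [d / (m + eps) for d, m in zip(diffs, mags)]                # unit-norm rows

    # 3. Score each row by how much it changed; more change -> higher weight.
    #    (Illustrative choice: the magnitudes themselves, softmax-normalized per row.)
    scores = np.concatenate(mags, axis=1)                              # (out_dim, n_models)
    shifted = scores - scores.max(axis=1, keepdims=True)               # numerical stability
    weights = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

    # 4. Weighted average of magnitudes and directions, then recombine.
    merged_mag = sum(weights[:, [i]] * mags[i] for i in range(len(diffs)))
    merged_dir = sum(weights[:, [i]] * dirs[i] for i in range(len(diffs)))
    merged_diff = merged_mag * merged_dir

    # 5. "Sometimes keep the original weight": rows that barely changed in any
    #    model fall back to the base row, so effective weights need not sum to 1.
    barely_changed = scores.max(axis=1) < keep_threshold               # (out_dim,)
    merged_diff[barely_changed] = 0.0

    return base + merged_diff

# Toy usage: merge three lightly perturbed copies of a random base matrix.
rng = np.random.default_rng(0)
base = rng.normal(size=(4, 8))
models = [base + 0.1 * rng.normal(size=(4, 8)) for _ in range(3)]
merged = merge_rows_widen_style(base, models)
print(merged.shape)  # (4, 8)
```

Merging magnitudes and directions separately, rather than averaging the raw weights, is the core of the row-wise disentanglement the thread describes.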
| Nanopublication | Part | Subject | Predicate | Object | Published By | Published On |
|---|---|---|---|---|---|---|
| large-language-models | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| Sailor | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| WIDEN | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| model-merging | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| weight-disentanglement | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasProvenance](http://www.nanopub.org/nschema#hasProvenance) (links a nanopublication to its provenance) | provenance | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasAssertion](http://www.nanopub.org/nschema#hasAssertion) (links a nanopublication to its assertion) | assertion | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasProvenance](http://www.nanopub.org/nschema#hasProvenance) (links a nanopublication to its provenance) | provenance | Sensenets | 2024-09-03T21:16:16.131Z |
| | assertion | | [hasProvenance](http://www.nanopub.org/nschema#hasProvenance) (links a nanopublication to its provenance) | provenance | Sensenets | 2024-09-03T21:16:16.131Z |