Philipp
58a6c34099
Added a CI/CD Graph
2024-01-24 19:44:57 +01:00
github-actions[bot]
cac95c8525
build(deps): bump transformers from 4.36.2 to 4.37.0 (#579)
...
Bumps [transformers](https://github.com/huggingface/transformers ) from
4.36.2 to 4.37.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/huggingface/transformers/releases ">transformers's
releases</a>.</em></p>
<blockquote>
<h2>v4.37 Qwen2, Phi-2, SigLIP, ViP-LLaVA, FastSpeech2Conformer, 4-bit
serialization, Whisper longform generation</h2>
<h2>Model releases</h2>
<h3>Qwen2</h3>
<p>Qwen2 is the new model series of large language models from the Qwen
team. Previously, the Qwen series was released, including Qwen-72B,
Qwen-1.8B, Qwen-VL, Qwen-Audio, etc.</p>
<p>Qwen2 is a language model series including decoder language models of
different model sizes. For each size, we release the base language model
and the aligned chat model. It is based on the Transformer architecture
with SwiGLU activation, attention QKV bias, group query attention,
mixture of sliding window attention and full attention, etc.
Additionally, we have an improved tokenizer adapted to multiple natural
languages and code.</p>
<ul>
<li>Add qwen2 by <a
href="https://github.com/JustinLin610 "><code>@JustinLin610</code></a>
in <a
href="https://redirect.github.com/huggingface/transformers/issues/28436 ">#28436</a></li>
</ul>
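The mixture of sliding-window and full attention mentioned above can be pictured as an attention mask pattern. A toy sketch in plain Python (illustrative only, not Qwen2's actual implementation; the function name is made up):

```python
def sliding_window_mask(seq_len, window):
    """Causal attention mask with a sliding window.

    Position i may attend to positions j with i - window < j <= i.
    A window >= seq_len degenerates to full causal attention, which
    is what distinguishes full-attention layers from sliding-window
    layers in a mixed stack.
    """
    return [
        [1 if 0 <= i - j < window else 0 for j in range(seq_len)]
        for i in range(seq_len)
    ]

# Window of 2: each position sees itself and its predecessor only.
mask = sliding_window_mask(4, 2)
```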
<h3>Phi-2</h3>
<p>Phi-2 is a transformer language model trained by Microsoft with
exceptionally strong performance for its small size of 2.7 billion
parameters. It was previously available as a custom code model, but has
now been fully integrated into transformers.</p>
<ul>
<li>[Phi2] Add support for phi2 models by <a
href="https://github.com/susnato "><code>@susnato</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/28211 ">#28211</a></li>
<li>[Phi] Extend implementation to use GQA/MQA. by <a
href="https://github.com/gugarosa "><code>@gugarosa</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/28163 ">#28163</a></li>
<li>update docs to add the <code>phi-2</code> example by <a
href="https://github.com/susnato "><code>@susnato</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/28392 ">#28392</a></li>
<li>Fixes default value of <code>softmax_scale</code> in
<code>PhiFlashAttention2</code>. by <a
href="https://github.com/gugarosa "><code>@gugarosa</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/28537 ">#28537</a></li>
</ul>
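The GQA/MQA extension referenced in #28163 amounts to several query heads sharing one key/value head. A minimal sketch of that head mapping (illustrative, not the transformers code):

```python
def kv_head_for_query_head(q_head, num_q_heads, num_kv_heads):
    """Map a query head to its shared key/value head.

    In grouped-query attention (GQA) several query heads share one
    KV head; num_kv_heads == 1 is multi-query attention (MQA), and
    num_kv_heads == num_q_heads recovers standard multi-head attention.
    """
    assert num_q_heads % num_kv_heads == 0
    group_size = num_q_heads // num_kv_heads
    return q_head // group_size

# With 8 query heads and 2 KV heads, heads 0-3 share KV head 0.
mapping = [kv_head_for_query_head(h, 8, 2) for h in range(8)]
```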
<h3>SigLIP</h3>
<p>The SigLIP model was proposed in Sigmoid Loss for Language Image
Pre-Training by Xiaohua Zhai, Basil Mustafa, Alexander Kolesnikov, Lucas
Beyer. SigLIP proposes to replace the loss function used in CLIP by a
simple pairwise sigmoid loss. This results in better performance in
terms of zero-shot classification accuracy on ImageNet.</p>
<ul>
<li>Add SigLIP by <a
href="https://github.com/NielsRogge "><code>@NielsRogge</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/26522 ">#26522</a></li>
<li>[SigLIP] Don't pad by default by <a
href="https://github.com/NielsRogge "><code>@NielsRogge</code></a> in <a
href="https://redirect.github.com/huggingface/transformers/issues/28578 ">#28578</a></li>
</ul>
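The pairwise sigmoid loss can be made concrete with a small self-contained sketch (plain Python; this shows the idea only, not the transformers implementation):

```python
import math

def siglip_pairwise_loss(logits):
    """Pairwise sigmoid loss over an image-text similarity matrix.

    logits[i][j] is the scaled similarity of image i and text j.
    Matching pairs (i == j) get label +1, all others -1; each pair is
    scored independently with a log-sigmoid term, unlike CLIP's
    softmax normalization over whole rows and columns.
    """
    n = len(logits)
    total = 0.0
    for i in range(n):
        for j in range(n):
            label = 1.0 if i == j else -1.0
            # -log(sigmoid(label * logit)) via the softplus identity
            total += math.log1p(math.exp(-label * logits[i][j]))
    return total / n

# Matches scoring high and mismatches low give a small loss.
good = [[10.0, -10.0], [-10.0, 10.0]]
bad = [[-10.0, 10.0], [10.0, -10.0]]
```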
<h3>ViP-LLaVA</h3>
<p>The VipLlava model was proposed in Making Large Multimodal Models
Understand Arbitrary Visual Prompts by Mu Cai, Haotian Liu, Siva Karthik
Mustikovela, Gregory P. Meyer, Yuning Chai, Dennis Park, Yong Jae
Lee.</p>
<p>VipLlava enhances the training protocol of Llava by marking images
and interacting with the model using natural cues such as a “red
bounding box” or “pointed arrow” during training.</p>
<ul>
<li>Adds VIP-llava to transformers by <a
href="https://github.com/younesbelkada "><code>@younesbelkada</code></a>
in <a
href="https://redirect.github.com/huggingface/transformers/issues/27932 ">#27932</a></li>
<li>Fix Vip-llava docs by <a
href="https://github.com/younesbelkada "><code>@younesbelkada</code></a>
in <a
href="https://redirect.github.com/huggingface/transformers/issues/28085 ">#28085</a></li>
</ul>
<h3>FastSpeech2Conformer</h3>
<p>The FastSpeech2Conformer model was proposed with the paper Recent
Developments On Espnet Toolkit Boosted By Conformer by Pengcheng Guo,
Florian Boyer, Xuankai Chang, Tomoki Hayashi, Yosuke Higuchi, Hirofumi
Inaguma, Naoyuki Kamo, Chenda Li, Daniel Garcia-Romero, Jiatong Shi,
Jing Shi, Shinji Watanabe, Kun Wei, Wangyou Zhang, and Yuekai Zhang.</p>
<p>FastSpeech 2 is a non-autoregressive model for text-to-speech (TTS)
synthesis which builds upon FastSpeech, improving training speed,
inference speed and voice quality. It consists of a variance adapter
(duration, energy and pitch predictors) and a waveform and
mel-spectrogram decoder.</p>
<ul>
<li>Add FastSpeech2Conformer by <a
href="https://github.com/connor-henderson "><code>@connor-henderson</code></a>
in <a
href="https://redirect.github.com/huggingface/transformers/issues/23439 ">#23439</a></li>
</ul>
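The duration-driven, non-autoregressive idea can be sketched with a toy "length regulator" that expands each input token by its predicted duration (illustrative Python, not the transformers implementation; names are made up):

```python
def length_regulate(hidden_states, durations):
    """Expand per-phoneme hidden states by predicted durations.

    This is the core trick of FastSpeech-style non-autoregressive TTS:
    a duration predictor decides how many output frames each input
    token covers, so the decoder can produce the whole mel-spectrogram
    in parallel instead of aligning autoregressively.
    """
    expanded = []
    for state, duration in zip(hidden_states, durations):
        expanded.extend([state] * duration)
    return expanded

# Token "a" covers 2 frames, "b" 1 frame, "c" 3 frames.
frames = length_regulate(["a", "b", "c"], [2, 1, 3])
```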
<h3>Wav2Vec2-BERT</h3>
<p>The Wav2Vec2-BERT model was proposed in Seamless: Multilingual
Expressive and Streaming Speech Translation by the Seamless
Communication team from Meta AI.</p>
<p>This model was pre-trained on 4.5M hours of unlabeled audio data
covering more than 143 languages. It requires fine-tuning to be used
for downstream tasks such as Automatic Speech Recognition (ASR) or
audio classification.</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/huggingface/transformers/commit/8e3e145b427196e014f37aa42ba890b9bc94275e "><code>8e3e145</code></a>
[<code>GPTNeoX</code>] Fix BC issue with 4.36 (<a
href="https://redirect.github.com/huggingface/transformers/issues/28602 ">#28602</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/344943b88a1df08b797721f600ab826371029a4a "><code>344943b</code></a>
Fix <code>_speculative_sampling</code> implementation (<a
href="https://redirect.github.com/huggingface/transformers/issues/28508 ">#28508</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/5fc3e60cd8824b4bcf31e1c51f5c29a83e277be1 "><code>5fc3e60</code></a>
[SigLIP] Don't pad by default (<a
href="https://redirect.github.com/huggingface/transformers/issues/28578 ">#28578</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/5ee9fcb5cc7c6684d79e02e9d76be045373786f7 "><code>5ee9fcb</code></a>
Fix wrong xpu device in DistributedType.MULTI_XPU mode (<a
href="https://redirect.github.com/huggingface/transformers/issues/28386 ">#28386</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/e156abd05a9910106351edaa4fa856c5ba93a0b5 "><code>e156abd</code></a>
[Whisper] Finalize batched SOTA long-form generation (<a
href="https://redirect.github.com/huggingface/transformers/issues/27658 ">#27658</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/a485e469f62e5eaa55e3e6b786a573a3b939e30c "><code>a485e46</code></a>
Add w2v2bert to pipeline (<a
href="https://redirect.github.com/huggingface/transformers/issues/28585 ">#28585</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/d381d85466db4b6e998212d9788e91d201792355 "><code>d381d85</code></a>
Release: v4.37.0</li>
<li><a
href="https://github.com/huggingface/transformers/commit/db9a7e9d3dbd1b595f004597a0502cce0a96135a "><code>db9a7e9</code></a>
Don't save <code>processor_config.json</code> if a processor has no
extra attribute (<a
href="https://redirect.github.com/huggingface/transformers/issues/2 ">#2</a>...</li>
<li><a
href="https://github.com/huggingface/transformers/commit/772307be7649e1333a933cfaa229dc0dec2fd331 "><code>772307b</code></a>
Making CTC training example more general (<a
href="https://redirect.github.com/huggingface/transformers/issues/28582 ">#28582</a>)</li>
<li><a
href="https://github.com/huggingface/transformers/commit/186aa6befecc6e6f022fed34019a00d60884d557 "><code>186aa6b</code></a>
[Whisper] Fix audio classification with weighted layer sum (<a
href="https://redirect.github.com/huggingface/transformers/issues/28563 ">#28563</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/huggingface/transformers/compare/v4.36.2...v4.37.0 ">compare
view</a></li>
</ul>
</details>
<br />
[Dependabot compatibility score](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
<details>
<summary>Dependabot commands and options</summary>
<br />
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
</details>
2024-01-23 23:33:56 +01:00
Sebastian
3c327966c3
Added section 4-3-2 "archive processing" (#574)
...
A short summary of the archive processing.
---------
Co-authored-by: Philipp Horstenkamp <philipp@horstenkamp.de >
2024-01-23 15:19:11 +01:00
dependabot[bot]
34456bd48c
build(deps): bump transformers from 4.36.2 to 4.37.0
...
Bumps [transformers](https://github.com/huggingface/transformers ) from 4.36.2 to 4.37.0.
- [Release notes](https://github.com/huggingface/transformers/releases )
- [Commits](https://github.com/huggingface/transformers/compare/v4.36.2...v4.37.0 )
---
updated-dependencies:
- dependency-name: transformers
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com >
2024-01-22 15:45:39 +00:00
Tim Ronneburg
05f692294a
Docs/abschlussarbeit tim (#521)
...
First Version of my Documentation.
2024-01-21 12:53:05 +01:00
Tim
974e109759
fixed pandas bug
2024-01-21 12:43:04 +01:00
Tim
9106e7e51a
install poetry logs
2024-01-21 11:37:09 +01:00
Tim
b458d761d8
added change requests
2024-01-21 11:34:30 +01:00
Tim Ronneburg
6e56e61323
Update documentations/Ergebnisse/Zwischenbericht_und_Praesentation/TiRo/verflechtungsanalyse.md
...
Co-authored-by: Philipp Horstenkamp <philipp@horstenkamp.de >
2024-01-21 11:34:29 +01:00
Tim
d781617fe5
poetry update
2024-01-21 11:34:29 +01:00
Tim Ronneburg
c912d72585
Update documentations/Ergebnisse/Abschlussbericht_und_Praesentation/TiRo/S2.md
...
Co-authored-by: Tristan Nolde <tristan.nolde@yahoo.de >
2024-01-21 11:33:41 +01:00
Tim
59ab16a870
Changed pandoc doc dependencies
2024-01-21 11:33:41 +01:00
Tim Ronneburg
eee2ac124f
Update documentations/Ergebnisse/Abschlussbericht_und_Praesentation/TiRo/S2.md
...
Co-authored-by: Tristan Nolde <tristan.nolde@yahoo.de >
2024-01-21 11:33:41 +01:00
Tim Ronneburg
2572a04448
Update documentations/Ergebnisse/Abschlussbericht_und_Praesentation/TiRo/S2.md
...
Co-authored-by: Tristan Nolde <tristan.nolde@yahoo.de >
2024-01-21 11:33:40 +01:00
Tim Ronneburg
1865098cd7
Update documentations/Ergebnisse/Abschlussbericht_und_Praesentation/TiRo/S2.md
...
Co-authored-by: Tristan Nolde <tristan.nolde@yahoo.de >
2024-01-21 11:33:40 +01:00
Tim
da68d7d0b3
Added Links and minor changes again
2024-01-21 11:33:39 +01:00
Tim
f78d4d83a7
Added Links and minor changes
2024-01-21 11:33:39 +01:00
Tim
23eaf763c0
Last fixes
2024-01-21 11:33:39 +01:00
Tim
6c87ef9841
Include pictures of the Frontend
2024-01-21 11:33:38 +01:00
Tim
2409d80112
Added Chapter 5.2.4
2024-01-21 11:33:38 +01:00
Tim
10c3cf9288
Quick update
2024-01-21 11:33:37 +01:00
Tim
4b800cc523
Chapter 2 of the documentation completed.
2024-01-21 11:33:37 +01:00
Tim
3151b5265b
First Version of the network Documentation
2024-01-21 11:33:37 +01:00
Philipp
7cc0c1455a
Update pre-commit hooks (#577)
...
Update versions of pre-commit hooks to latest version.
2024-01-20 12:21:13 +01:00
Philipp
881ce019c7
Dependency updates (#576)
2024-01-20 11:46:08 +01:00
philipp-horstenkamp
df7f5a11f6
chore: update pre-commit hooks
2024-01-20 07:33:44 +00:00
Philipp
73cdab5f2c
Updates
2024-01-19 21:47:03 +01:00
Philipp
7ca72191f8
Added a plot about the Docker architecture (#564)
...
Added a plot about the Docker architecture
2024-01-17 16:39:08 +01:00
Tristan Nolde
e57ef07a3d
hotfix: Improve UI (#569)
...
Co-authored-by: Philipp Horstenkamp <philipp@horstenkamp.de >
2024-01-17 16:22:46 +01:00
Philipp
555e71820d
Removed dead docstrings (#570)
...
Removed dead docstrings
2024-01-16 17:38:54 +01:00
Philipp
2f0fe65b59
Added NaN handling for network KPIs and a scaler. (#568)
...
Added NaN handling for network KPIs and a scaler.
2024-01-16 17:38:28 +01:00
Philipp
e1b4d7d8ee
Reworked the tests
2024-01-15 22:32:00 +01:00
Philipp
5c5b33153a
Fixed a test
2024-01-15 22:30:21 +01:00
Philipp
7c943b3997
Fixed a test
2024-01-15 22:28:34 +01:00
Philipp
b512eef308
Reverted some changes
2024-01-15 22:21:27 +01:00
Philipp
1785114ec1
Removed dead docstrings
2024-01-15 22:21:27 +01:00
Philipp
609599f9f3
Added a graph about the talks (#565)
...
Please comment on the plot. Is something missing?
This should fill the introductory slide about the areas of
responsibility.
2024-01-15 22:14:27 +01:00
Philipp
93c0e94d97
Replaced the bind with the connection method (#567)
...
The syntax I showed you was wrong.
Now it is correct.
Sorry.
2024-01-15 21:13:47 +01:00
Philipp
77e52fd3e2
Added NaN handling for network KPIs and a scaler.
2024-01-15 21:10:32 +01:00
Philipp
27a95c1c23
Replaced the bind with the connection method
2024-01-15 20:24:38 +01:00
Philipp
1b6553e853
Removed and renamed the graph
2024-01-15 20:14:42 +01:00
Sebastian
829780fd2b
Add files via upload
2024-01-15 20:01:51 +01:00
github-actions[bot]
3af9ff58d6
build(deps-dev): bump types-setuptools from 69.0.0.20240106 to 69.0.0.20240115 (#566)
...
Bumps [types-setuptools](https://github.com/python/typeshed ) from
69.0.0.20240106 to 69.0.0.20240115.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/python/typeshed/commits ">compare view</a></li>
</ul>
</details>
2024-01-15 19:58:59 +01:00
dependabot[bot]
4a41cad72b
build(deps-dev): bump types-setuptools
...
Bumps [types-setuptools](https://github.com/python/typeshed ) from 69.0.0.20240106 to 69.0.0.20240115.
- [Commits](https://github.com/python/typeshed/commits )
---
updated-dependencies:
- dependency-name: types-setuptools
dependency-type: direct:development
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com >
2024-01-15 15:34:31 +00:00
Philipp
8b029dc003
Minor reformatting
2024-01-14 19:14:39 +01:00
Philipp
c36e380263
Added a graph about the talks
2024-01-14 19:08:35 +01:00
Tristan Nolde
f7832a9656
hotfix: Replace remote-debugging-port with remote-debugging-pipe (#563)
2024-01-14 12:11:17 +01:00
Philipp
c092a7ceca
Update CONTRIBUTING.md (#561)
...
Update CONTRIBUTING.md
2024-01-13 18:03:53 +01:00
Philipp
e31b0a3e8f
Update CONTRIBUTING.md
2024-01-13 09:59:03 +01:00
Philipp
0c8a6925b0
Update pre-commit hooks (#562)
...
Update versions of pre-commit hooks to latest version.
2024-01-13 09:58:06 +01:00