⚓ T191804 Allow to store files between 4 and 5 GB
Assigned To
Bawolff
Authored By
Dereckson
Apr 9 2018, 3:05 PM
2018-04-09 15:05:49 (UTC+0)
Tags
Multimedia
(Untriaged)
Commons
(Incoming)
MediaWiki-File-management
(Backlog)
SRE-swift-storage
(Inbox)
media-backups
Data-Persistence-Backup
(Done)
User-notice-archive
(Backlog)
Referenced Files
None
Subscribers
aaron
Aklapper
AlexisJazz
Bawolff
Dereckson
Don-vip
Frostly
Description
Currently, MediaWiki can use a Swift backend to store files. Out of the box, without any provision for larger objects, Swift can store files of up to 5 GB.
A file size limit currently exists at 2^32 bytes (4 GB).
It would be convenient to allow MediaWiki to store files of up to 5 GB when the Swift backend is used.
For that, we should ensure file sizes are stored as 64-bit integers, not 32-bit ones.
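To make the 32-bit constraint concrete, here is a minimal Python sketch (not MediaWiki code; the constants are the generic limits discussed above):

```python
# Illustrative sketch: why a 32-bit unsigned size field caps files at
# 4 GiB, while Swift's default single-object limit is 5 GB.

UINT32_MAX = 2**32 - 1                 # largest value a 32-bit unsigned column holds
SWIFT_SINGLE_OBJECT_LIMIT = 5 * 10**9  # Swift's default max object size, in bytes

def fits_in_uint32(size_bytes: int) -> bool:
    """True if the file size can be stored in a 32-bit unsigned integer."""
    return 0 <= size_bytes <= UINT32_MAX

five_gb = 5 * 10**9
print(fits_in_uint32(4 * 2**30 - 1))          # just under 4 GiB -> True
print(fits_in_uint32(five_gb))                # 5 GB -> False: needs a 64-bit field
print(five_gb <= SWIFT_SINGLE_OBJECT_LIMIT)   # still within Swift's limit -> True
```

So a 5 GB file fits comfortably in Swift, but not in any 32-bit size column along the way.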
See also
enwiki: Village pump (proposals) - RfC: Increasing the maximum size for uploaded files
Details
Related Changes in Gerrit:
Subject | Repo | Branch | Lines +/-
Increase $wgMaxUploadSize to 5 GiB (previously was 4GiB). | operations/mediawiki-config | master | +1 -1
sql: Migrate mediabackups metadata size from int to bigint | operations/software/mediabackups | master | +2 -2
Related Objects
Task Graph
Mentions
Status | Assigned | Task
Resolved | Bawolff | T191804: Allow to store files between 4 and 5 GB
Resolved | Bawolff | T191805: Allow Mediawiki to store file size greater than 32 bits
Resolved | aaron | T348294: FSFileBackend spends a lot of time generating unneeded sha1 hashes that are expensive for large files
Resolved | ABran-WMF | T348183: Apply schema change for changing img_size, oi_size, us_size, and fa_size to BIGINT
Open | None | T357184: Consider increasing $wgTranscodeBackgroundSizeLimit to 5GB
Mentioned In
T382859: Server-side upload request for Koavf
T295007: Upload by URL should use the job queue, possibly chunked with range requests
T351023: Raise hard limit for transcode sizes to 4 GiB or higher
rOSMBee2c49752320: sql: Migrate mediabackups metadata size from int to bigint
T191805: Allow Mediawiki to store file size greater than 32 bits
T281520: Create Wikinewsie's Portal wiki
T219081: Please upload two ~15GB files to Wikimedia Commons
T224606: Upload 11 GB file "Bergensbanen full-length.webm" to Wikimedia Commons
T202157: Please upload a large file to Wikimedia Commons
T191802: [Epic] Determine a strategy to store files between 5 and 100 GB
Mentioned Here
T191805: Allow Mediawiki to store file size greater than 32 bits
T348183: Apply schema change for changing img_size, oi_size, us_size, and fa_size to BIGINT
Event Timeline
Dereckson
created this task.
Apr 9 2018, 3:05 PM
2018-04-09 15:05:49 (UTC+0)
Restricted Application
added a project:
Commons
Apr 9 2018, 3:05 PM
2018-04-09 15:05:50 (UTC+0)
Restricted Application
added a subscriber:
Aklapper
Dereckson
updated the task description.
(Show Details)
Apr 9 2018, 3:06 PM
2018-04-09 15:06:20 (UTC+0)
zhuyifei1999
subscribed.
Apr 9 2018, 3:33 PM
2018-04-09 15:33:21 (UTC+0)
Reedy
added a project:
SRE-swift-storage
Apr 9 2018, 3:59 PM
2018-04-09 15:59:06 (UTC+0)
Dereckson
updated the task description.
(Show Details)
Apr 9 2018, 4:16 PM
2018-04-09 16:16:08 (UTC+0)
Dereckson
added subscribers:
Bawolff
aaron
Bawolff
closed subtask
T191805: Allow Mediawiki to store file size greater than 32 bits
as
Declined
Apr 9 2018, 4:24 PM
2018-04-09 16:24:00 (UTC+0)
revi
subscribed.
Apr 12 2018, 7:51 AM
2018-04-12 07:51:57 (UTC+0)
fgiunchedi
mentioned this in
T191802: [Epic] Determine a strategy to store files between 5 and 100 GB
Apr 16 2018, 9:42 AM
2018-04-16 09:42:00 (UTC+0)
Mholloway
subscribed.
Apr 16 2018, 2:04 PM
2018-04-16 14:04:21 (UTC+0)
Yann
subscribed.
Jun 7 2018, 4:59 PM
2018-06-07 16:59:33 (UTC+0)
Urbanecm
mentioned this in
T202157: Please upload a large file to Wikimedia Commons
Aug 17 2018, 9:24 PM
2018-08-17 21:24:03 (UTC+0)
Framawiki
subscribed.
Aug 19 2018, 8:56 AM
2018-08-19 08:56:55 (UTC+0)
Krenair
subscribed.
May 20 2019, 12:21 AM
2019-05-20 00:21:45 (UTC+0)
Urbanecm
mentioned this in
T224606: Upload 11 GB file "Bergensbanen full-length.webm" to Wikimedia Commons
May 29 2019, 6:46 PM
2019-05-29 18:46:27 (UTC+0)
Urbanecm
mentioned this in
T219081: Please upload two ~15GB files to Wikimedia Commons
Jun 2 2019, 9:18 PM
2019-06-02 21:18:54 (UTC+0)
Bugreporter
reopened subtask
T191805: Allow Mediawiki to store file size greater than 32 bits
as
Open
Jan 14 2020, 11:24 PM
2020-01-14 23:24:02 (UTC+0)
RhinosF1
subscribed.
Mar 30 2021, 4:54 PM
2021-03-30 16:54:45 (UTC+0)
Ltrlg
subscribed.
Apr 2 2021, 8:05 AM
2021-04-02 08:05:39 (UTC+0)
Zabe
subscribed.
Apr 12 2021, 9:00 AM
2021-04-12 09:00:27 (UTC+0)
Urbanecm
mentioned this in
T281520: Create Wikinewsie's Portal wiki
Apr 29 2021, 11:50 PM
2021-04-29 23:50:03 (UTC+0)
TheresNoTime
removed a subscriber:
RhinosF1
Dec 15 2022, 11:36 PM
2022-12-15 23:36:27 (UTC+0)
Frostly
subscribed.
Oct 2 2023, 2:52 AM
2023-10-02 02:52:20 (UTC+0)
Bawolff
mentioned this in
T191805: Allow Mediawiki to store file size greater than 32 bits
Oct 4 2023, 6:52 PM
2023-10-04 18:52:54 (UTC+0)
Bawolff
added a subscriber:
MatthewVernon
Edited
Oct 6 2023, 5:10 PM
2023-10-06 17:10:41 (UTC+0)
@MatthewVernon
I think you're the right person to ask. With work being done so that MediaWiki is no longer limited to 4 GB files, I was wondering what SRE's opinion would be on increasing the Commons file size limit to 5 GB. My understanding is that this is the limit for non-segmented objects in Swift. If the MediaWiki limitations were removed, do you think it would be reasonable to increase the limit at Wikimedia? Having mildly larger files would presumably put more pressure on our media storage/serving infrastructure, although I would expect the number of 5 GB files to be fairly limited.
Long term it would be great to support really big files with Swift large objects, but 5 GB seems like a small improvement we can make almost today with very little work.
There is also the middle-ground option of not allowing users to upload 5 GB files in general, but still allowing server-side uploads of such files via importImages.php.
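The middle-ground option described above amounts to two different caps depending on the upload path. A hypothetical sketch (names invented for illustration; this is not MediaWiki's actual code):

```python
# Hypothetical sketch of a two-tier upload limit: a stricter cap for
# ordinary web/API uploads than for server-side imports via
# importImages.php. Constants mirror the limits discussed in this task.

WEB_UPLOAD_LIMIT = 4 * 2**30    # 4 GiB for ordinary uploads
SERVER_SIDE_LIMIT = 5 * 10**9   # 5 GB, Swift's single-object ceiling

def allowed(size_bytes: int, server_side: bool) -> bool:
    """Return True if an upload of this size is permitted on this path."""
    limit = SERVER_SIDE_LIMIT if server_side else WEB_UPLOAD_LIMIT
    return size_bytes <= limit

print(allowed(4_500_000_000, server_side=False))  # False: over the web cap
print(allowed(4_500_000_000, server_side=True))   # True: importImages.php path
```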
Bugreporter
renamed this task from
Allow to store files between 4 and 5 Gb
to
Allow to store files between 4 and 5 GB
Oct 15 2023, 3:03 PM
2023-10-15 15:03:10 (UTC+0)
Bugreporter
updated the task description.
(Show Details)
Bawolff
closed subtask
T191805: Allow Mediawiki to store file size greater than 32 bits
as
Resolved
Oct 27 2023, 10:07 AM
2023-10-27 10:07:19 (UTC+0)
As a note on current status: as part of T191805, MediaWiki will now accept files of up to 5 GB with Swift. $wgMaxUploadSize is 4 GB, so this only affects files uploaded from the command line via importImages.php. Additionally, the schema change (T348183) has not yet been deployed.
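For reference, $wgMaxUploadSize is the MediaWiki setting that caps uploads here. A hedged sketch of the configuration side; the value shown is illustrative, not the exact production patch (which lives in operations/mediawiki-config):

```php
<?php
// Illustrative only: cap uploads at 5 GiB. The actual Wikimedia change
// is in operations/mediawiki-config; this just shows the setting involved.
$wgMaxUploadSize = 5 * 1024 * 1024 * 1024; // bytes (5 GiB)
```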
jcrespo
subscribed.
Nov 9 2023, 3:11 PM
2023-11-09 15:11:06 (UTC+0)
Please loop me in on the progress. While this doesn't affect production, I may have assumed in some cases that files were always smaller than 4 GB for backups, and I may need to review their storage compatibility, even if that just means applying the same schema change to the backup metadata.
AlexisJazz
subscribed.
Nov 9 2023, 9:32 PM
2023-11-09 21:32:09 (UTC+0)
jcrespo
added a comment.
Nov 10 2023, 9:35 AM
2023-11-10 09:35:34 (UTC+0)
@AlexisJazz
While we are happy that you are excited about this, it is far from ready for discussion. Developers have just delivered the code, but a lot of preparation and discussion is still required before this can be implemented at WMF by system administrators, given the scale of operations: there are many open questions regarding the extra Swift space needed, backup compatibility, schema change deployment, and other work.
While community input is very much welcome, I feel that asking for it is not appropriate at the moment: no matter how strongly a vote came out in favour, it cannot be enabled right now. I would suggest you pause any discussion on wiki to avoid disappointment. No amount of support will make technical problems solve themselves faster, and you should wait for the option to be available on our servers first (it is not right now, and it won't be until all engineers agree it is ready). For example: storing more data requires more disk space, which requires a budget to buy more servers, and that is usually approved at the end of the fiscal year, if approved at all. Sorry, but things take time.
Please understand that the title "Allow to store files between 4 and 5 GB" refers to the technical ability, and the preparation needed for that, not to community consensus.
JEumerus
subscribed.
Nov 10 2023, 10:22 AM
2023-11-10 10:22:36 (UTC+0)
AlexisJazz
added a comment.
Edited
Nov 10 2023, 11:58 AM
2023-11-10 11:58:53 (UTC+0)
@jcrespo
thanks for letting me know. I misunderstood Bawolff's comment.
Well, I can partially answer one of your open questions. You won't really need extra space for English Wikipedia. The largest file on enwiki is currently just shy of 300 MB, and I'd be surprised if even 20 feature films transcoding to over 4 GB were uploaded over the course of a whole year. 20 GB (say 30 GB including MediaWiki transcodes) is probably a rounding error for you anyway. If I had believed my proposal would result in a substantial increase in the storage space needed, I would have asked you first; but the use case for >4 GB files on enwiki, while there certainly is one, is limited in quantity. It could be more substantial if enwiki suddenly decided to mass-upload all the PD-USonly feature films that can be found, but that would be more substantial regardless of whether the limit is 4 GB or 5 GB.
Even on Commons I doubt you'll notice the impact. Commons currently has about 660 files over 3 GB; about 435 of those are over 3.5 GB. The oldest one is from 2012 and the second oldest is from 2016. I'd estimate that an extra 500 GB/year will probably cover it for the next few years. 500 GB (say 750 GB including MW transcodes) is less than 1% of the storage I have at my disposal; I'd be severely worried if you couldn't handle it. (I know your storage costs far more per GB, with backups, redundancy, caching, etc., but still!)
Drop me a note on my talk page on enwiki when this is available on betacommons if you need help testing.
Bawolff
added a comment.
Nov 10 2023, 12:19 PM
2023-11-10 12:19:51 (UTC+0)
In
T191804#9319565
@jcrespo
wrote:
Please loop me in on the progress. While this doesn't affect production, I may have assumed in some cases that files were always smaller than 4 GB for backups, and I may need to review their storage compatibility, even if that just means applying the same schema change to the backup metadata.
To clarify, do you just want to be informed, or is backup support blocking this? The reason I ask is that one potential implementation path (nothing has been decided or even really discussed yet) is to begin by allowing a few large files to be uploaded on special request before allowing it in general. If so, that might happen quicker than you think for a small number of files, as there are fewer capacity issues if it's just special cases. To emphasize, nothing has been decided; that is just one potential path.
Re Alexis: yeah, this is too early for community consensus. However, in the event that any community members have objections to increasing the limit, please let me know. I don't expect this to be a controversial change when it eventually happens, but if there are any problems or objections I would like to know early.
jcrespo
added a comment.
Edited
Nov 10 2023, 12:25 PM
2023-11-10 12:25:25 (UTC+0)
Thank you,
@AlexisJazz
that's useful feedback that will no doubt make our media storage team happy. Still, there are additional technical operations and challenges to overcome. Cost is not so much the concern (especially for enwiki's needs); we are more concerned about Commons, with its almost half a petabyte of storage. Servers still have to be purchased, racked and installed, data resharded, and everything planned, and that takes time; it is not just a question of "buying larger disks". :-D
Please keep subscribed for more updates.
is backup support blocking this
It is not a hard blocker, as I am guessing not many files will be uploaded soon; they may just fail to be backed up, and we can retry them later. But I would like to fix it ASAP, and as I didn't know this was ongoing it came as a bit of a surprise. Nothing that cannot be solved. I believe MinIO's maximum object size is 50 TB, so only the metadata storage has to be reviewed. If files later grow beyond that, it may affect future storage decisions.
jcrespo
added a comment.
Nov 10 2023, 12:32 PM
2023-11-10 12:32:19 (UTC+0)
Indeed, the same schema change as in production has to be applied to the backup metadata, since we mirrored the size from MediaWiki as an unsigned int.
Please give me until next week to apply that schema change; it should be easy.
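The schema change described here amounts to widening a 32-bit size column. A hedged sketch in SQL (table and column names are hypothetical, not the actual mediabackups schema):

```sql
-- Illustrative only: widen a 32-bit size column so it can hold values
-- above 2^32 - 1 (4 GiB). Table and column names are hypothetical.
ALTER TABLE files
    MODIFY size BIGINT UNSIGNED NOT NULL;
```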
gerritbot
added a comment.
Nov 10 2023, 5:48 PM
2023-11-10 17:48:45 (UTC+0)
Change 973364 had a related patch set uploaded (by Jcrespo; author: Jcrespo):
[operations/software/mediabackups@master] sql: Migrate mediabackups metadata size from int to bigint
gerritbot
added a project:
Patch-For-Review
Nov 10 2023, 5:48 PM
2023-11-10 17:48:46 (UTC+0)
Framawiki
unsubscribed.
Nov 13 2023, 12:27 AM
2023-11-13 00:27:33 (UTC+0)
gerritbot
added a comment.
Nov 13 2023, 10:04 AM
2023-11-13 10:04:23 (UTC+0)
Change 973364
merged
by Jcrespo:
[operations/software/mediabackups@master] sql: Migrate mediabackups metadata size from int to bigint
jcrespo
mentioned this in
rOSMBee2c49752320: sql: Migrate mediabackups metadata size from int to bigint
Nov 13 2023, 10:05 AM
2023-11-13 10:05:26 (UTC+0)
Maintenance_bot
removed a project:
Patch-For-Review
Nov 13 2023, 10:10 AM
2023-11-13 10:10:46 (UTC+0)
TheDJ
mentioned this in
T351023: Raise hard limit for transcode sizes to 4 GiB or higher
Nov 15 2023, 1:37 PM
2023-11-15 13:37:27 (UTC+0)
jcrespo
added a project:
media-backups
Nov 17 2023, 11:09 AM
2023-11-17 11:09:52 (UTC+0)
Stashbot
added a comment.
Nov 17 2023, 11:10 AM
2023-11-17 11:10:08 (UTC+0)
Mentioned in SAL (#wikimedia-operations)
[2023-11-17T11:10:08Z]
T191804
Stashbot
added a comment.
Nov 17 2023, 11:20 AM
2023-11-17 11:20:47 (UTC+0)
Mentioned in SAL (#wikimedia-operations)
[2023-11-17T11:20:46Z]
T191804
jcrespo
added a project:
Data-Persistence-Backup
Nov 22 2023, 8:28 AM
2023-11-22 08:28:59 (UTC+0)
This is now deployed and the media-backups schema is up to date. Media backups are flowing as usual. I am no longer a blocker here.
jcrespo
moved this task from
Triage
to
Done
on the
Data-Persistence-Backup
board.
Nov 22 2023, 8:29 AM
2023-11-22 08:29:13 (UTC+0)
Bawolff
added a comment.
Nov 23 2023, 5:30 AM
2023-11-23 05:30:21 (UTC+0)
As a note, my expectation is that the primary usage for a higher limit would be:
Long (or HD) videos which previously had to use more aggressive compression to fit in 4 GB
Videos which previously had to be split into multiple files
Right now we are averaging 12 files over 3 GB a month on Commons (overwritten and deleted files are negligible), and 3 files a month over 3.9 GB. [enwiki has 0.]
It seems that, for the most part, large files are special cases, and I'm doubtful that increasing the limit will affect much in terms of capacity.
AlexisJazz
added a comment.
Nov 23 2023, 6:19 AM
2023-11-23 06:19:30 (UTC+0)
In
T191804#9354216
@Bawolff
wrote:
As a note, my expectation is that the primary usage for a higher limit would be:
Long (or HD) videos which previously had to use more aggressive compression to fit in 4 GB
Videos which previously had to be split into multiple files
Right now we are averaging 12 files over 3 GB a month on Commons (overwritten and deleted files are negligible), and 3 files a month over 3.9 GB. [enwiki has 0.]
It seems that, for the most part, large files are special cases, and I'm doubtful that increasing the limit will affect much in terms of capacity.
There's one more use I think: the 4K transcode of
failed, presumably due to the 4GB limit.
Unrelated: note
which is under 4 minutes yet it's hugging the 4GB limit. This may be lossless?
Bawolff
added a comment.
Edited
Nov 23 2023, 6:44 AM
2023-11-23 06:44:18 (UTC+0)
There's one more use I think: the 4K transcode of https://commons.wikimedia.org/wiki/File:Politparade.webm failed, presumably due to the 4GB limit.
More likely it hit a timeout; the lower-resolution transcodes were already taking 9 hours. However, if the transcode was over 4 GB then the FileStoreRepo changes would fix it. Anyway, this task won't fix everything about large video files; there are still aspects that are going to be shaky for very large files. [edit: after resetting the transcode it worked. The transcode is 3.88 GB, which is right on the edge. Perhaps it would have been over the limit with an earlier version of the transcoding software.]
Unrelated: note
which is under 4 minutes yet it's hugging the 4GB limit. This may be lossless?
It's 60 fps 4K video. If it were lossless I'd expect it to be a lot more than 4 GB. Given that it's from a video game, maybe some sort of streaming setup was used where the video was encoded live; those setups often trade latency for less efficient compression. The bitrate is 151 Mbps. I believe the normal bitrate for 60 fps 4K video is around 60 Mbps, so it's only about 2.5× the normal rate.
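The back-of-envelope arithmetic here can be checked directly (rough figures from the comment, not measured values):

```python
# Rough check of the bitrate arithmetic above, using figures from the comment.
bitrate_bps = 151e6   # reported bitrate: 151 Mbps
four_gb_bytes = 4e9   # the ~4 GB file size being discussed

# How long a stream at 151 Mbps takes to reach ~4 GB:
seconds_to_fill_4gb = four_gb_bytes * 8 / bitrate_bps
print(round(seconds_to_fill_4gb))   # ~212 s, i.e. about 3.5 minutes

# Ratio to a "normal" 60 Mbps 4K/60fps stream:
print(round(151 / 60, 1))           # ~2.5, i.e. between double and triple
```

This is consistent with a file under 4 minutes long sitting right at the 4 GB limit.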
AlexisJazz
added a comment.
Edited
Nov 23 2023, 11:34 AM
2023-11-23 11:34:56 (UTC+0)
In
T191804#9354264
@Bawolff
wrote:
There's one more use I think: the 4K transcode of https://commons.wikimedia.org/wiki/File:Politparade.webm failed, presumably due to the 4GB limit.
More likely it hit a timeout; the lower-resolution transcodes were already taking 9 hours. However, if the transcode was over 4 GB then the FileStoreRepo changes would fix it. Anyway, this task won't fix everything about large video files; there are still aspects that are going to be shaky for very large files.
How about both? The 1440p VP9 transcode is already 3.9 GB.
Some transcodes that would produce a >4 GB but <5 GB file are probably failing now, but should succeed once the limit is raised. This might result in a small sudden bump in storage use (though given the number of existing large videos, it's probably still a drop in the bucket).
Unrelated: note
which is under 4 minutes yet it's hugging the 4GB limit. This may be lossless?
It's 60 fps 4K video. If it were lossless I'd expect it to be a lot more than 4 GB.
I'm unsure how efficient VP9 lossless encoding is; I've never tried it. I worked with uncompressed video in the past, and IIRC that was ~1 GB/minute for SD video using Huffyuv. But since this is footage of a video game that's not in constant motion, if the codec only saves image data that changed compared to the previous frame and/or compresses multiple frames using the same dictionary, I suspect 1 GB/minute is maybe possible. Also, video game footage in this particular case possibly compresses better than live-action footage.
Given that it's from a video game, maybe some sort of streaming setup was used where the video was encoded live; those setups often trade latency for less efficient compression. The bitrate is 151 Mbps. I believe the normal bitrate for 60 fps 4K video is around 60 Mbps, so it's only about 2.5× the normal rate.
I've analyzed some screenshots and you're right: there are some compression artifacts visible around the text when zoomed in. Maybe the high bitrate is intentional after all; this game has a lot of detail and sharp edges (and also includes some text), which would be quite susceptible to degradation from lossy compression.
MatthewVernon
added a comment.
Nov 28 2023, 11:45 AM
2023-11-28 11:45:26 (UTC+0)
I think it's fair to say twelve 5 GB files a month would not be overwhelming (about 2 TB of raw capacity per cluster per year given 3× replication, cf. our current growth rate of very approximately 120 TB/year), and the underlying filesystems that Swift sits upon could cope with some 5 GB objects.
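The capacity estimate above works out as follows (figures taken from the comments in this thread):

```python
# Sanity check of the capacity estimate: 12 large files a month, each at
# the new 5 GB cap, stored with 3x replication.
files_per_month = 12    # observed rate of >3 GB uploads on Commons
file_size_gb = 5        # worst case: every such file at the 5 GB limit
replication = 3         # Swift replica count mentioned above

raw_tb_per_year = files_per_month * 12 * file_size_gb * replication / 1000
print(raw_tb_per_year)  # 2.16 TB/year, vs. ~120 TB/year total growth
```

So the worst-case extra load is under 2% of the cluster's existing annual growth.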
Don-vip
subscribed.
Dec 1 2023, 11:58 PM
2023-12-01 23:58:59 (UTC+0)
Bawolff
added a comment.
Edited
Feb 8 2024, 7:53 PM
2024-02-08 19:53:41 (UTC+0)
Server-side uploads should now work up to 5 GB.
I think we should upload a few >4 GB files via server-side upload just to make sure it all works, before enabling this for chunked uploads.
Bawolff
added a comment.
Feb 13 2024, 6:13 AM
2024-02-13 06:13:17 (UTC+0)
is the first file > 4GB on commons!
gerritbot
added a comment.
Feb 13 2024, 6:13 AM
2024-02-13 06:13:42 (UTC+0)
Change 1002813 had a related patch set uploaded (by Brian Wolff; author: Brian Wolff):
[operations/mediawiki-config@master] Increase $wgMaxUploadSize to 5 GiB (previously was 4GiB).
gerritbot
added a project:
Patch-For-Review
Feb 13 2024, 6:13 AM
2024-02-13 06:13:43 (UTC+0)
Bawolff
added a project:
User-notice
Feb 13 2024, 6:14 AM
2024-02-13 06:14:17 (UTC+0)
gerritbot
added a comment.
Feb 13 2024, 8:18 AM
2024-02-13 08:18:45 (UTC+0)
Change 1002813
merged
by jenkins-bot:
[operations/mediawiki-config@master] Increase $wgMaxUploadSize to 5 GiB (previously was 4GiB).
Stashbot
added a comment.
Feb 13 2024, 8:19 AM
2024-02-13 08:19:58 (UTC+0)
Mentioned in SAL (#wikimedia-operations)
[2024-02-13T08:19:58Z]
T191804
Stashbot
added a comment.
Feb 13 2024, 8:21 AM
2024-02-13 08:21:41 (UTC+0)
Mentioned in SAL (#wikimedia-operations)
[2024-02-13T08:21:41Z]
T191804
synced to the testservers
hashar
subscribed.
Feb 13 2024, 8:27 AM
2024-02-13 08:27:27 (UTC+0)
In
T191804#9321105
@AlexisJazz
wrote:
That is now archived at
hashar
updated the task description.
(Show Details)
Feb 13 2024, 8:28 AM
2024-02-13 08:28:13 (UTC+0)
Stashbot
added a comment.
Feb 13 2024, 8:28 AM
2024-02-13 08:28:56 (UTC+0)
Mentioned in SAL (#wikimedia-operations)
[2024-02-13T08:28:55Z]
T191804
(duration: 08m 57s)
Maintenance_bot
removed a project:
Patch-For-Review
Feb 13 2024, 8:30 AM
2024-02-13 08:30:42 (UTC+0)
Bawolff
closed this task as
Resolved
Feb 13 2024, 8:33 AM
2024-02-13 08:33:14 (UTC+0)
Bawolff
claimed this task.
UOzurumba
subscribed.
Feb 16 2024, 12:31 AM
2024-02-16 00:31:14 (UTC+0)
@Bawolff
Re: Tech News: what wording would you suggest as the content, and when should it be included? Thanks!
UOzurumba
moved this task from
To Triage
to
Announce in next Tech/News
on the
User-notice
board.
Feb 16 2024, 4:07 AM
2024-02-16 04:07:07 (UTC+0)
UOzurumba
moved this task from
Announce in next Tech/News
to
In current Tech/News draft
on the
User-notice
board.
Feb 16 2024, 4:23 AM
2024-02-16 04:23:16 (UTC+0)
Bawolff
added a comment.
Feb 16 2024, 4:52 AM
2024-02-16 04:52:41 (UTC+0)
In
T191804#9549256
@UOzurumba
wrote:
@Bawolff
Re: Tech News: what wording would you suggest as the content, and when should it be included? Thanks!
"The maximum file size when using Upload Wizard is now 5 GiB."
The change has been deployed to all wikis; it can go out anytime.
UOzurumba
moved this task from
In current Tech/News draft
to
Already announced/Archive
on the
User-notice
board.
Feb 22 2024, 3:56 PM
2024-02-22 15:56:49 (UTC+0)
Maintenance_bot
edited projects, added
User-notice-archive
; removed
User-notice
Mar 3 2024, 4:30 PM
2024-03-03 16:30:13 (UTC+0)
Jeff_G
mentioned this in
T295007: Upload by URL should use the job queue, possibly chunked with range requests
Mar 23 2024, 4:05 PM
2024-03-23 16:05:00 (UTC+0)
hashar
unsubscribed.
Apr 10 2024, 1:02 PM
2024-04-10 13:02:47 (UTC+0)
Zabe
mentioned this in
T382859: Server-side upload request for Koavf
Jan 1 2025, 10:19 PM
2025-01-01 22:19:57 (UTC+0)