/g/ - Technology

Computers, Software, Technology




(223.45 KB 1365x630 Screenshot_2019-09-20_21-14-27.png)
Anonymous 09/21/2019 (Sat) 04:50:51 No. 91
Hi everyone! I have a question about image serving. I'm making a 4chan clone, and currently what I do is receive the image from the client, make thumbnails on the web server, and then use rsync to send everything to my "image server", which is just an Ubuntu box that serves a folder as-is, with no checks. An issue (I think) is that the user's file ends up being transferred twice: once when the web server receives it, and again when rsync copies it to the "image server."

I also have to make thumbnails, for which I just use ImageMagick. Currently I make the thumbnails on the web server, because I will likely have multiple instances of it running, so it seems fitting to put the heavier processing there. As a result, I also send the thumbnails (and more) over to the "image server."

One issue is that I also have to delete files manually via rsync. While nginx is fast, if I were to soft-delete a file in my database (the database contains basic file information like filename, format, date, and more), I'd have no way to forward that change and stop the box from serving the file. I'm depending on rsync working, when I'd prefer to "check" whether a file is allowed to be shared before I send it to the client. Do you think it would be worth writing up a REST API to get more control?
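For what it's worth, nginx can consult a REST endpoint on every request by itself via the `auth_request` module, so the box never serves a file the database has soft-deleted. A sketch (the paths and the upstream port are assumptions, not anything from the actual setup):

```nginx
# Before serving anything under /images/, nginx makes a subrequest to
# the check endpoint; 2xx means serve the file, 401/403 means deny.
location /images/ {
    auth_request /check;
    root /var/www/files;
}

location = /check {
    internal;
    # Hypothetical REST endpoint that looks the file up in the DB.
    proxy_pass http://127.0.0.1:3000/api/allowed;
    proxy_pass_request_body off;
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}
```

The nice part is the static files still come straight off disk through nginx; only a tiny yes/no subrequest hits the API.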

As you can see, there is no sync between the file system and the database, and this can result in me serving illegal content, or in stale content stacking up and wasting space.

Do you guys have any suggestions on how I can improve this architecture? Are there better ways to handle this? I'm going to have one instance of the "image server" and multiple instances of my web servers. I really don't want some random power outage to leave me unknowingly serving illegal content because I have no way to sync my file system with the database.

One solution I was thinking of was forwarding the request to a custom REST API that would ask the database whether the files are safe to share. The only issue is that this doesn't solve the duplicate download....

Any advice would be helpful!

(pic related, it's what the app currently looks like)
Cool project Anon, just wondering though, why reinvent the wheel? Why not just use a modern updated imageboard engine that already exists (this assumes that this isn't a passion project)?
It's out of passion and hopefully will look nice on my resume! I currently have one other project on it, but it's unfortunately really basic...

This will be my first "real" project! I'm honestly proud of what I've written so far. It's almost done!
It looks really awesome Anon, you should update us every once in a while with your progress.
cool project anon, gl.
will there be an r9k clone?
There will likely be a board for r9k-ish topics, but I will probably not implement unique posting. It just seems like a feature that nobody uses, and it unfortunately doesn't promote unique replies anyway, since everyone just types "originno" after their posts to get around it.
You need a database like SQLite to handle the photos rapidly on your Ubuntu machine.

Don't bother converting thumbnails. Just render the image with a smaller width and height in HTML. The user's browser will do the work for you.
Thank you for the input, but using the client doesn't solve the issue of speed. I solved file syncing by using a custom REST API, and now I don't have to worry about it anymore! Thumbnails aren't a hassle to make, and the speedup from serving them is definitely worth the time spent implementing them.
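For reference, the whole thumbnail step is a single ImageMagick `convert` call. A sketch of building that invocation from Node (the flags shown are one reasonable choice, not necessarily what the app uses, and actually executing it assumes ImageMagick is installed, so that part is left commented):

```javascript
// Build an ImageMagick invocation that fits an image inside maxDim x maxDim,
// preserving aspect ratio; the '>' suffix means "only shrink, never enlarge".
function thumbnailArgs(src, dest, maxDim) {
  return ["convert", src, "-thumbnail", `${maxDim}x${maxDim}>`, "-strip", dest];
}

const [cmd, ...args] = thumbnailArgs("in.png", "thumb.png", 250);
console.log(cmd, args.join(" "));
// convert in.png -thumbnail 250x250> -strip thumb.png

// To actually run it (assumes ImageMagick on PATH):
// require("child_process").execFileSync(cmd, args);
```

A 250px thumbnail is typically a few KB versus a multi-MB original, which is where the catalog-page speedup comes from.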
I finally deployed! :D
(99.83 KB 696x598 Selection_359.png)
>Do you guys have any suggestions to how I can improve this architecture?
Do what glowniggers do and ditch the database. Use a flat file instead. Database laws no longer apply.

>an attempt was made
>Database laws no longer apply
What did you mean by this? What are these database laws you referred to?
(64.36 KB 828x909 1565922670857.jpg)
>why doesn't X support webp??!!
>add webp support
>breaks on Mac
>now need to store webp and jpeg versions for all files (at least for thumbnails)
>takes up extra storage and time to compute

I still can't believe Safari doesn't support webp... for fucks sake even Palemoon does it
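At least the dual-storage pain doesn't have to leak into the app code: nginx can pick the variant per request off the Accept header. A sketch (the paths and the `cat.jpg.webp` naming convention are assumptions):

```nginx
# Maps to ".webp" only when the client advertises image/webp support.
map $http_accept $webp_suffix {
    default         "";
    "~*image/webp"  ".webp";
}

server {
    location /thumbs/ {
        # Try cat.jpg.webp first for webp-capable browsers, else cat.jpg.
        add_header Vary Accept;
        try_files $uri$webp_suffix $uri =404;
    }
}
```

The `Vary: Accept` header matters if a cache or CDN sits in front, so Safari users don't get served a cached webp response.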
webp is just cancer, the worst file format out there. I hate how nothing can work with it.
Can we force all websites to drop webp support? And go back to the old days?
there isn't anything wrong with webp other than its lack of support. If webp had the same support as jpg in browsers and in applications, then we wouldn't be complaining. Fuck Apple and fuck Safari for blocking progress
Fuck goygle and fuck their webp standard. Even if it is an "open format", goygle will inch their bullshit in, like they did with Chrome and web standards. And fuck you.
This, only less hostile.
>there isn't anything wrong with webp other than its lack of support

Yet here we are, it's almost 2020, and it's still not supported.

You know what it's like to download a picture and find out it's webp? So I need to run my script (Linux) to convert it to PNG? It's infuriating!

>there isn't anything wrong with webp
I actually hate webp for the fact that it hides what it is. A webp can be a lossy format like JPEG or a lossless format like PNG, only no one knows which, because webp is a fucken container and both of these things can be in it. So you have no idea if it's lossless (perfect quality, will not degrade, takes lots of space) or lossy (degrades, worse quality, but less space).

I hope no one supports webp and everyone drops support for it, especially Google.
(364.48 KB 576x454 1572053957473.gif)
>An issue (I think) is that I end up downloading the user's file twice. I download it once on the web server, and then again on the "image server."
mfw it finally hits me that I should've streamed the file from the client through the webserver and on to the image server, instead of waiting for it to fully download on the webserver. I wish I hadn't used Multer and had opted for busboy early on so I could use streaming... but the even larger issue is that my code is such a mess that it's not easy to change something fundamental. I'm working on a different project now, but if I could go back in time, I would use MinIO to store images and stream them over the webserver to solve the duplicate download, would use a different architecture than your basic MVC, and would write the front end in Next.js rather than plain React. Thank you for reading my blog
>>291 Wut?! Uh... using thumbnails will SIGNIFICANTLY reduce bandwidth and increase loading speed... also browser resizing is usually shit-level bicubic or bilinear, or if you're using IE I think they're still stuck on nearest-neighbor resizing. I like using PHP with the Lanczos algorithm, in conjunction with cached thumbnails. You can create "perfect liquid" websites like this one I made back in the early 2Ks... http://mimkrys.epizy.com/qausi.html (you can also click on the window and drag it around). Resize your browser to any size you like and the entire site will recode itself on the fly... http://mimkrys.epizy.com/internals.js God Level coding! :D ...not very practical, but fun for "proof of concept" stuff.
>>638 your website is just a blank white page without javashit turned on, and with it, it's Google telling me they won't let me on unless I let them store cookies in my browser
>>655 Well, it sounds like you need an adult to help you figure out how to turn on javascript and enable cookies. I mean, I guess it ~sounds~ pretty... self-explanatory, but uh... some tards are just slower than others I guess. *shrugs*
>>743 OpenNSFW wasn't designed for hentai. I'll need to learn how to use TensorFlow and train my own model, but until then keep posting haha


no cookies?