Discuss Scratch
- Discussion Forums
- » Collaboration
- » Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
- IGABMS
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
xouzouris wrote:
IGABMS wrote:
*if we host it on our own devices. Why are we hosting this on another site? You guys do realize we're like 100x more likely to accidentally DDoS our own networks.
Guys, has anybody else been having issues with getting notified of new posts?
Scratch is always like that. Most of the time I get notifs after a page has passed, but if you're lucky you can get a notif instantly. I don't know if there is a specific rule for when you get the notif, or if it's random.
Alright, thanks!
If you respond to something I write here, I probably won't see it, so let me know on my profile! Feel free to ask me questions about PHP, Python, Scratch, HTML, CSS, or JavaScript there, too!
- 101Corp
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
I am not going to join the collab, but here is an idea that I have tested and I know works in scratch 3:
If you have an extra computer lying around (a cheap Raspberry Pi 4 2GB will do), you can have it run the project continually in a “server mode”. A project in “client mode” sends events over the cloud variables, and the server responds with data, possibly in steps. More after lunch
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
A more detailed explanation:
There are four cloud variables: server_data, server_response?, client_data, and client_response?. When the server sends data, it sets server_data to the data it is sending. It also sets server_response? to 1. Meanwhile, the client (the people that are online) is constantly scanning for when server_response? equals 1. When it becomes 1, the client stores the data somewhere and sets server_response? to 0 so that it doesn't read the data multiple times. When the client sends data, the same thing happens, except it uses client_data and client_response?, and the server sets client_response? to 0 after reading the data in client_data.
Presumably, the server uses a list to store data like websites. The downsides are that there can only be 200,000 websites (unless you are running it in TurboWarp), that every website is lost when the server loses power, and that there is a limited amount of memory available for websites (~1 GB if using a Raspberry Pi 4 2GB).
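The four-variable handshake above can be sketched in ordinary Python. This is only a simulation: a plain dict stands in for the cloud variables (which really live on Scratch's servers and can only hold numbers, a detail ignored here), and the function names are made up for illustration.

```python
# Simulated cloud variables, named as in the post above.
cloud = {
    "server_data": "",
    "server_response?": 0,
    "client_data": "",
    "client_response?": 0,
}

def client_send(data):
    """Client writes its data and raises its flag for the server to notice."""
    cloud["client_data"] = data
    cloud["client_response?"] = 1

def server_poll():
    """One pass of the server's loop: if the client flag is up, read the
    data, lower the flag so it isn't read twice, and send back a reply."""
    if cloud["client_response?"] == 1:
        request = cloud["client_data"]
        cloud["client_response?"] = 0            # consume the message
        cloud["server_data"] = "echo:" + request  # the server's answer
        cloud["server_response?"] = 1
        return request
    return None

def client_poll():
    """One pass of the client's loop: read the server's reply exactly once."""
    if cloud["server_response?"] == 1:
        reply = cloud["server_data"]
        cloud["server_response?"] = 0            # don't read it again
        return reply
    return None

client_send("GET site/home")
assert server_poll() == "GET site/home"  # server consumed the request
reply = client_poll()
print(reply)  # -> echo:GET site/home
```

Lowering the flag before acting on the data is what prevents the same message from being processed on every loop iteration.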
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
*And live stuff gets even laggier.
- xouzouris
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
IGABMS wrote:
*if we host it on our own devices. Why are we hosting this on another site? You guys do realize we're like 100x more likely to accidentally DDoS our own networks.
Guys, has anybody else been having issues with getting notified of new posts?
Yes, Scratch rarely notifies me of new posts; I have to check manually. I thought it was a feature: when there are a lot of new posts in a forum, it doesn't notify you because that might be annoying.
>> Head of Scratchedia <<
Scratchedia is the recreation of the
internet, on Scratch, that's safe!
Random Stuff:
My little cousin asked me why we can't make and run a nuclear reactor at home…
Don't forget to keep on Scratchin'
- xouzouris
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
kkidslogin wrote:
-snip-
First of all, I THOUGHT OF THE EXACT SAME THING O_O Second, I do have a Raspberry Pi.
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
xouzouris wrote:
First of all, I THOUGHT OF THE EXACT SAME THING O_O Second, I do have a Raspberry Pi.
Ha! That's funny! I also have an add-on to this idea: if you don't want Chromium running 24/7, you can package a standalone offline package using the TurboWarp Packager, and have cloud variables set to “TurboWarp's online server”. And perhaps pester -GarboMuffin- about adding cloud variables and locally stored variables to TurboWarp Desktop.
- IGABMS
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
kkidslogin wrote:
-snip-
Hmmm, interesting ideas! However, I believe that Scratchedia should be decentralized, hence not running on a Raspberry Pi. I'm good for whatever.
- IGABMS
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
kkidslogin wrote:
-snip-
This is what I've been saying, except without another central computer. This could be accomplished by having a data-transfer variable and a request address: each client is always checking for a request to its address, and then sets the data-transfer variable to the content.
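The decentralized request-address scheme can be sketched the same way. Again this is only a simulation with invented names: a dict stands in for the two shared cloud variables, and each peer keeps its own pages locally.

```python
# Simulated shared cloud variables for the decentralized variant.
cloud = {"request_address": "", "data_transfer": ""}

# Each peer hosts its own website content locally (one dict per peer).
peers = {
    "peer-1": {"home": "welcome to peer-1"},
    "peer-2": {"home": "peer-2 homepage"},
}

def request(address):
    """A client asks the peer at `address` for its content."""
    cloud["request_address"] = address

def peer_poll(my_address):
    """One pass of a peer's loop: if the current request names this peer,
    publish its content and clear the request to mark it as served."""
    if cloud["request_address"] == my_address:
        cloud["data_transfer"] = peers[my_address]["home"]
        cloud["request_address"] = ""

request("peer-2")
for addr in peers:        # every peer checks; only peer-2's address matches
    peer_poll(addr)
print(cloud["data_transfer"])  # -> peer-2 homepage
```

The obvious trade-off, raised later in the thread, is that a page only exists while its peer is online and polling.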
- xouzouris
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
Ok, I have a Raspberry Pi if we need it.
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
IGABMS wrote:
-snip-
This is what I've been saying, except without another central computer.
This could be accomplished by having a data transfer variable and a request address
and the client is always checking for a request to its address and then sets the data transfer variable to the content.
Hmmm… Having the client host the website could be sketchy… When the client leaves the project, or even when their computer falls asleep, the website would be lost forever. So really, the client would need a Raspberry Pi running server mode as well, and preferably a separate account to run the server.
I forgot to quote your post so I had to do it manually
- GAMS2
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
Don't tell me I have to scrap the website…
Edit: Also, the problem with that is: if the ISP (Internet Service Provider) goes down, the “server” has a blackout, the internet box needs a reset, or the internet starts having problems on the server's end, it's screwed. Hey, even if the “server” breaks, it's all screwed.
Last edited by GAMS2 (May 17, 2021 15:44:09)
My signature is too big, select the text and use Shift + Down arrow to see more!
Have I stopped recording?
No?
Then how do I turn this off?
Press the red button?
Okay I am pr-
Latest project:I finally feel…. appreciated….
If life throws something at you, try your best to dodge it, if it hits you and you fall…. get back up and keep moving forward. Don't give up, and make it through to the end. (MY OWN QUOTE!)
Do people still read these?
Nah, probably not.
I am now a furry. Drawings of my Fursona can be found here.
lol what have I done with my life? (Added on August 13th, 2020)
- IGABMS
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
kkidslogin wrote:
-snip-
Hmmm… Having the client host the website could be sketchy… When the client leaves the project, or even when their computer falls asleep, the website would be lost forever. So really, the client would need a Raspberry Pi running server mode as well, and preferably a separate account to run the server.
I forgot to quote your post so I had to do it manually
Wow! That's a lot of work to quote it manually! I believe that if we want it to be accessible, people shouldn't need a Raspberry Pi. This should also be easily copy-able (with credit, of course).
It's up to people to store it. Whenever someone leaves, it just frees up more space for other people and lessens the load Scratch has to deal with. My vote is that if you leave, your website is cleared.
And guys, quick reminder: this is the INTERNET IN SCRATCH, NOT the internet partially rooted in Scratch that also runs on a Node.js server and requires each user to have a $15 Raspberry Pi.
- IGABMS
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
You know what, why don't we just write this in Python, make it peer-to-peer with 256-bit asymmetric encryption, keep a remote data backup in 5 remote countries, and have 3 more backups in a safety deposit box in a fortress? This doesn't have to be so complex! I have repeatedly stated that we DO NOT NEED to leave Scratch for this, yet it still seems that everyone insists that we do!
- NILL_GAMES10
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
IGABMS wrote:
-snip-
This doesn't have to be so complex! I have repeatedly stated that we DO NOT NEED to leave Scratch for this, yet it still seems that everyone insists that we do!
Alright, let's have a poll then: https://forms.gle/u4MeDkREDvx2AanE7
I've moved accounts. Go to my new account here.
- Real_WeBino
- Scratcher
39 posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
Is this an OS? If not, and it's a browser, then if you want it on an OS (which will hopefully one day be scripted in Visual Studio Code and added to HP, Dell, and Chromebooks; learn more here: Discover OS Forum), I will be happy to add it.
Hello! I am Real_WeBino! The owner of WeBino!
link images will be here soon when this account turns to scratcher
Check out my other profiles! GuitarGuyPlayz MrCreeperFX
Current possible DEV: WeBino Discover 1 - The OS Of The Future!
Next DEV: T.B.A
3rd DEV: T.B.A
4th DEV: T.B.A
5th DEV: T.B.A
6th DEV: T.B.A
7th DEV: T.B.A
8th DEV: T.B.A
- kkidslogin
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
IGABMS wrote:
-snip-
And guys, quick reminder: this is the INTERNET IN SCRATCH, NOT the internet partially rooted in Scratch that also runs on a Node.js server and requires each user to have a $15 Raspberry Pi.
Hmmm… that makes something different than the internet… But it sounds cool. Like the internet before disk drives! I have a question, though: are you guys going to make a text version of this browser before the graphical one? A graphical browser can be quite difficult. And don't forget that for a while, the internet just looked like reading a file from UXTerm.
- NILL_GAMES10
- Scratcher
1000+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
I have an idea… It involves using a sophisticated UI system and hiring a very good programmer.
- xouzouris
- Scratcher
100+ posts
Scratchedia - The Internet on Scratch, that's safe (Applications Closed)
IGABMS wrote:
-snip-
This doesn't have to be so complex! I have repeatedly stated that we DO NOT NEED to leave Scratch for this, yet it still seems that everyone insists that we do!
The Scratch cloud variable servers are just not good enough! They are slow and unreliable!
Last edited by xouzouris (May 17, 2021 19:28:13)