Best way to prevent supabase real-time from dying?

Do I have to send a ping every 25 seconds or something to keep the socket alive? Or is there a better way of ensuring it doesn't disconnect from inactivity?
53 Replies
garyaustin (3w ago)
If your tab goes into the background, or the device puts the browser/app into low-power or background mode, the timers slow down and the connection will drop.
garyaustin (3w ago)
Towards the end of this issue is a newer feature that uses a web worker to keep the connection alive; I've not seen it documented yet. https://github.com/supabase/realtime-js/issues/121 But in reality you will likely need to deal with connection failure anyway, and use tab visibility to decide whether to reconnect on error or to wait and reconnect when the tab is visible again. If you need all changes/messages, then you have to handle errors, restart, and collect the missed data regardless, as there is no queue.
GitHub
Realtime websocket loses connection regularly when browser tab goes...
Bug report Describe the bug With a simple subscription to a table (RLS or Not) the websocket drops and reconnects every 3 minutes after 5 minutes of tab in background. Verifying on Windows 10 with ...
Tariq (OP, 3w ago)
lmao, this seems to be a common issue. I'm not really familiar with web workers, but I think you're right about only reconnecting when the tab is active
garyaustin (3w ago)
It also saves connections, which are a billing metric and can get expensive.
Tariq (OP, 3w ago)
true
```ts
const client = new RealtimeClient(
  "wss://PROJECT_URL.supabase.co/realtime/v1",
  {
    worker: true,
    heartbeatIntervalMs: 15000,
    logger: console.log,
    params: {
      apikey: "API_KEY",
    },
  }
);
```
was this the web worker code you were referring to?
garyaustin (3w ago)
Could be. I've not tried it, as I went with polling for some stuff, and will use broadcast from the DB if I get working on it again. Then I'll use either the message table as the queue to fetch missed messages, or my own intermediate table if the original table does not have a date to fetch from. I'll keep a local-storage timestamp of the last realtime event, and on any error (or after a few minutes in the background) shut down until visible again, reconnect, and then fetch what I missed. Even with the web worker, if you lose connection (wifi/phone network drops) and realtime reconnects even a few seconds later, you could have missed data.
Tariq (OP, 3w ago)
yeah, I saw you mention that in one of the GitHub issues. That's not a problem because I already have TanStack Query set up to fetch the initial messages
garyaustin (3w ago)
Realtime works better for state monitoring, where if you miss a state, you pick it back up on the next one.
Tariq (OP, 3w ago)
I'm just wondering the best way of writing the logic for re-establishing the websocket connection. filipecabaco closed the issue and says it's released in 2.10.7, so I assumed that this was no longer a problem or that there was a built-in solution
garyaustin (3w ago)
There is no built-in. I believe you have to know to use that flag, unless I've missed something; no docs on it that I found. And it does not deal with reconnecting, it just tries to avoid disconnecting.
Tariq (OP, 3w ago)
so what exactly did he mean by "This has been released in 2.10.7"? yeah, well, that doesn't seem to work either
garyaustin (3w ago)
https://github.com/GaryAustin1/Realtime2 and the linked discussion there talk about/show what I looked at, but those are more just experiments to find the issues. I assume he means the web worker flag is now available. I just added it to my realtime test code that I always have running in Chrome, which fails in a background tab after several minutes. I'll let you know if it keeps it alive.
Tariq (OP, 3w ago)
alright
garyaustin (3w ago)
I added this to the supabase-js createClient options and have had no errors in the background in about 40 minutes. Usually there are errors within about 3 or 4 minutes.
[image attachment]
Tariq (OP, 3w ago)
Ok, that's promising. This code is only responsible for keeping the socket open, right?
garyaustin (3w ago)
I assume that if my connection dropped, the websocket itself would fail and reconnect, but I don't know that for sure. Normally the heartbeat gets skipped, and that is the first the server seems to know of a bad connection.
Well, it avoids using the slowed-down timers for the heartbeat. The socket does not drop in the background... the heartbeat slows down to the point where it misses.
Tariq (OP, 3w ago)
I was using one of the code blocks provided that reconnects when the connection is lost, based on tab visibility, but now sending a message takes an extra ~3 seconds:
```ts
useEffect(() => {
  let subscription: RealtimeChannel

  const setupRealtimeSubscription = async () => {
    await unsubscribeRealtimeConnection()
    subscription = supabase
      .channel('global_messages')
      .on(
        'postgres_changes',
        {
          event: 'INSERT',
          schema: 'public',
          table: 'global_messages',
        },
        (payload) => {
          const newMessage = payload.new;
          queryClient.setQueryData(["global_messages"], (oldData: any) => {
            if (!oldData) return [newMessage];
            return [...oldData, newMessage];
          });
        }
      )
      .subscribe((status) => {
        console.log('Global messages listener...', status)
      })
  }

  const unsubscribeRealtimeConnection = async () => {
    if (subscription) {
      const message = await subscription.unsubscribe()
      console.log(`${message} - Global messages listener removed.`)
    }
  }

  const handleVisibilityChange = async () => {
    if (document.visibilityState === 'visible') {
      console.log('Tab is visible again.')
      if (subscription?.state === 'closed') {
        console.log('SUBSCRIPTION IS CLOSED.')
        // Token refresh is important to prevent reconnection failure
        const { data } = await supabase.auth.getSession()
        if (data.session) {
          supabase.realtime.setAuth(data.session.access_token)
          setupRealtimeSubscription()
        }
      }
    }
  }

  // Set up initial subscription
  setupRealtimeSubscription()

  // Listen for visibility changes
  document.addEventListener('visibilitychange', handleVisibilityChange)

  // Cleanup
  return () => {
    unsubscribeRealtimeConnection()
    document.removeEventListener('visibilitychange', handleVisibilityChange)
  }
}, [])
```
oh i see
garyaustin (3w ago)
Not sure what "sending a message" means in this context.
Tariq (OP, 3w ago)
It's what I'm using the realtime for: sending a message in a global chat channel
Tariq (OP, 3w ago)
[image attachment]
garyaustin (3w ago)
With postgres_changes there is a delay, from hundreds of milliseconds to several seconds, between SUBSCRIBED and the point where it connects to the DB and would notice any changes. Any changes between starting the subscription code and a status coming back (not SUBSCRIBED) saying postgres_changes is connected would be missed. Not sure why there would be a delay once connected, though.
[image attachment]
Tariq (OP, 3w ago)
well, the delay is only occurring now, after using this useEffect code snippet
garyaustin (3w ago)
```ts
const mySubscription2 = supabases[2]
  .channel('mychannel2')
  .on('postgres_changes',
    { event: '*', table: 'cat' }, async payload => {
      console.log('Change received1!', payload)
    })
  .subscribe((status, err) => {
    console.log('subscribe = status', status, err);
  })
  .on('system', {}, async payload => {
    console.log('system', payload)
  })
```
is my test subscription. The .on('system') handler detects when you really start getting data. The initial connection delay, where it is not really connected, runs from .subscribe() until the .on('system') payload arrives containing the 'postgres_changes' message; that is the DB connecting to the realtime server. SUBSCRIBED is just your code connecting to the realtime server. Another reason to go to broadcast changes is to avoid all of this.
Tariq (OP, 3w ago)
Oh, I see what you mean, I think. I sent two messages back to back and the second one was a lot quicker than the first. what do you mean
garyaustin (3w ago)
But it does not "delay" messages; they just won't be received until the second event. Supabase now recommends going to Broadcast from the server instead of postgres_changes. They have a shell built around that, called broadcast_changes, to simulate this, but it is just private-channel Broadcast messages sent from a trigger function on the table you want to monitor.
Tariq (OP, 3w ago)
ok, hold on, let me get the payload from system instead and see what happens. oh
Tariq (OP, 3w ago)
this whole thing is just about the most complex thing I've encountered
Tariq (OP, 3w ago)
I thought this solution would work, but now it's telling me I have duplicate keys
[image attachment]
garyaustin (3w ago)
Very much. Is that a Supabase error or a Next.js error? I don't use Next.js.
Tariq (OP, 3w ago)
it's a React error, it's referring to the .map of the messages
garyaustin (3w ago)
Do you have the same message twice in the array?
Tariq (OP, 3w ago)
I'm assuming so, because it appears twice in the chatbox and disappears when I reload the page. I think it's because I'm manually appending the payload to the data fetched with react-query
garyaustin (3w ago)
So you might be fetching your past data and then, when realtime connects, getting the last piece of data again, or the opposite. A race condition.
Tariq (OP, 3w ago)
that makes sense
garyaustin (3w ago)
If you have a key, you should probably be upserting the realtime row by that key rather than appending, so it just replaces the row if it already exists.
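That suggestion can be sketched as a small pure helper: merge each realtime row into the cached list by its primary key, so a row that arrives from both the initial fetch and from realtime is stored only once. This is a sketch under assumptions: the `Message` shape and the `id` key are guesses at the table's schema, and `upsertMessage` is a hypothetical name.

```typescript
// Sketch: upsert a realtime row into the cached list by primary key.
// Assumes each row has a unique `id` column (adjust to your schema).
type Message = { id: string; content?: string; created_at?: string };

function upsertMessage(messages: Message[], incoming: Message): Message[] {
  const index = messages.findIndex((m) => m.id === incoming.id);
  if (index === -1) return [...messages, incoming]; // new row: append
  const next = [...messages];
  next[index] = incoming; // duplicate/updated row: replace in place
  return next;
}
```

In the `postgres_changes` handler this would replace the plain append, e.g. `queryClient.setQueryData(["global_messages"], (old: Message[] | undefined) => upsertMessage(old ?? [], payload.new))`.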
Tariq (OP, 3w ago)
Ok so I replaced
```ts
(payload) => {
  const newMessage = payload.new;
  queryClient.setQueryData(["global_messages"], (oldData: any) => {
    if (!oldData) return [newMessage];
    return [...oldData, newMessage];
  });
}
```
with
```ts
(payload) => {
  queryClient.invalidateQueries({ queryKey: ["global_messages"] });
}
```
but now it's consistently slow, because invalidateQueries refetches the data from the database whenever there's a new payload
garyaustin (3w ago)
All of the data?
Tariq (OP, 3w ago)
yes
garyaustin (3w ago)
At least keep a last key and only fetch rows greater than it. I've not seen any good examples of dealing with this stuff, and you are into the realm of many different approaches and opinions on how to handle it. It gets more complex if there are deletes involved.
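The "last key" idea can be sketched like this: track the newest `created_at` you already hold, then ask the database only for rows after it instead of invalidating the whole query. Assumptions: the column name `created_at`, and ISO-8601 timestamps so string comparison matches time order; `latestCreatedAt` is a hypothetical helper.

```typescript
// Sketch: find the newest created_at in the cached rows so a refetch can
// request only rows newer than it.
// Assumes ISO-8601 timestamps, so lexicographic comparison matches time order.
type Row = { id: string; created_at: string };

function latestCreatedAt(rows: Row[]): string | null {
  if (rows.length === 0) return null;
  return rows.reduce(
    (max, r) => (r.created_at > max ? r.created_at : max),
    rows[0].created_at
  );
}

// Hedged usage with supabase-js (not run here):
//   const last = latestCreatedAt(cached);
//   let query = supabase.from("global_messages").select("*");
//   if (last) query = query.gt("created_at", last);
//   const { data } = await query;
// Merge `data` into the cache by key rather than appending, to stay duplicate-safe.
```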
garyaustin (3w ago)
In my repository I have code like this to handle keeping an in-memory array of records updated.
[image attachment]
garyaustin (3w ago)
But that dealt with all three types of changes to a table.
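A handler covering all three change types might look like the pure function below. This is a sketch: the payload shape (`operation`, `record`, `old_record`) is an assumption modeled on what a trigger-driven change broadcast typically sends, and `applyChange` is a hypothetical name; verify the fields against your own setup.

```typescript
// Sketch: apply an INSERT/UPDATE/DELETE change to an in-memory array of rows.
// The payload field names are assumptions; check your trigger's actual output.
type Keyed = { id: string };

type Change<T extends Keyed> = {
  operation: "INSERT" | "UPDATE" | "DELETE";
  record: T | null;     // new row for INSERT/UPDATE
  old_record: T | null; // previous row for UPDATE/DELETE
};

function applyChange<T extends Keyed>(rows: T[], change: Change<T>): T[] {
  switch (change.operation) {
    case "INSERT":
    case "UPDATE": {
      if (!change.record) return rows;
      const rec = change.record;
      const i = rows.findIndex((r) => r.id === rec.id);
      // Upsert: overwrite if present (handles replays), append otherwise.
      return i === -1 ? [...rows, rec] : rows.map((r, j) => (j === i ? rec : r));
    }
    case "DELETE": {
      if (!change.old_record) return rows;
      const oldRec = change.old_record;
      return rows.filter((r) => r.id !== oldRec.id);
    }
  }
}
```

The same function works whether the change arrives via postgres_changes or via a broadcast-from-trigger setup, since both carry the new and old row in some form.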
Tariq (OP, 3w ago)
and you initially populated this table by fetching the data first?
garyaustin (3w ago)
It is in the repository, but I fetch the data AFTER postgres_changes is connected, to initialize the table. There could still be data coming through realtime that is also in the initial fetch, so I make sure to overwrite rather than just add.
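That ordering (connect first, fetch after) can be sketched with a small one-shot gate: a promise that the 'system' callback resolves, which the initial fetch awaits. Everything here is a sketch under assumptions; `fetchMessages` is a hypothetical helper and the channel calls are shown only in comments, not run.

```typescript
// Sketch: a one-shot gate that a realtime 'system' callback can open,
// so the initial fetch runs only after postgres_changes is attached to the DB.
function createReadyGate(): { ready: Promise<void>; open: () => void } {
  let open!: () => void;
  const ready = new Promise<void>((resolve) => {
    open = resolve;
  });
  return { ready, open };
}

// Hedged usage (channel API not run here; fetchMessages is hypothetical):
//   const gate = createReadyGate();
//   channel.on("system", {}, () => gate.open()); // DB-side connect confirmed
//   channel.subscribe();
//   await gate.ready;                            // nothing can now slip past us
//   const initial = await fetchMessages();       // hypothetical initial fetch
//   // merge `initial` into state by key, overwriting rows realtime already delivered
```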
Tariq (OP, 3w ago)
yeah, I saw you saying that doing it in that order prevents the user from missing data. yeah, exactly. URGHH, think I'ma just work on this again tomorrow. I might do it your way with the memoryTable, because I do plan on adding edits and deletes later. what is the memory table anyway? Is it a table in Redis, local storage?
garyaustin (3w ago)
Is the data something used all the time by the user, or only on demand? Like they go to see it in a tab?
Tariq (OP, 3w ago)
on demand, I'm guessing; they're only going to need the global chat when it's in view or they're typing in it
garyaustin (3w ago)
That one is just memory, for testing. My real app used IndexedDB to store records, and it would get updated.
Tariq (OP, 3w ago)
I'm assuming that IndexedDB isn't remote
garyaustin (3w ago)
Right, it is part of browsers. But I only kept the past 100 records for my purposes; if you searched the data, it went to the DB.
Tariq (OP, 3w ago)
oh, I didn't know that. so would it be better to use that or Redis in my use case? I'm not familiar with either
garyaustin (3w ago)
https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API Lots of shims for it.
I don't really know about Redis. There are many methods out there to handle this, but in my case the app had to keep your last batch of records for offline use, and you could change them on another device, so they needed to sync. I was also using realtime for a chat feature with rooms, but quickly went to only turning on realtime when you were in the chatroom, and polling to detect any change, to avoid connections if someone just had the app open. This was all before broadcast messages with security, which is the way I'll go now if I get back to it.
Tariq (OP, 3w ago)
Why there doesn't seem to be a guide for this whole thing is beyond me
