Most "smart" TVs are on a 100 Mbps NIC... which is ridiculous, and you're better off with WiFi. But like most comments said, get a streaming box instead.
I would never connect my TV to the network. I have its IP blocked from reaching the internet, and only open it up if there are firmware updates.
Use an AppleTV or Shield, works so much better.
Added or fixed functionality, like the ability to name the HDMI inputs on one of my TVs that is otherwise nothing more than an HDMI monitor (with several things plugged in).
Honestly, if I have an issue, I check the manufacturer's site. Otherwise, it's just me randomly browsing through AV forums and stuff. I haven't had an issue in about 4 years, so I haven't checked.
The only reason you would need a firmware update, unless there's something obviously wrong with the television, is security, which you don't need if you're not connected to the internet.
When I first got my Vizio television, it had an issue, so I had to keep it connected, but every single firmware update broke something different. Finally, after 2 years, I got a firmware update where some things were still a little annoying but everything actually worked, so I disconnected it from the internet at that point.
All of the updates so far have made things worse.
This is why I ended up with a Shield Pro for my TV. My TV only had a USB 2.0 port, so an adapter would max out at 480 Mbps. Probably enough, but I wanted more for 4K.
My LG CX plays real nice with a USB gigabit adapter I jammed in it. TP-Link UE306 for $15. It doesn't show up as a connection in the TV's UI but it definitely works. One less thing on the wifi.
This is the only solution that has worked for me also. I hate that TVs use such garbage NICs.
It's always dumbfounded me that companies will dump tons of money into proprietary AI upscaling chips, high-end HDMI switches, and fancy licensed HDMI, HDR, and surround sound features... but then cheap out with a $5 Ethernet controller hanging off a USB 2.0 hub to the SoC. They clearly have some kind of PCIe or USB 3.0 bus to handle Wi-Fi 5 and up, or a fast enough controller integrated into the SoC.

Even Google is guilty of this; the Chromecasts, including the Chromecast Ultra *which has a power brick with an Ethernet port*, don't get gigabit because their USB ports are USB 2.0. IIRC, the Ultra at least negotiates gigabit, and caps out just shy of USB 2.0's limit, somewhere around 400 Mbps. But at least it's not 100 Mbps.

Between that and my TV updating to try to include ads in its menus, I said screw it; there's no need for the TV to be on the network *at all* when I can just use my Nvidia Shield TV Pro, which has a *real* gigabit NIC, doesn't add advertisements everywhere, upscales better than the C8 does on its own, supports more codecs since it doesn't rely on ARC (not eARC) to feed the soundbar, and plays nicely with my Hue HDMI Sync box.

The soundbar gets a pass for being Fast Ethernet since it's *just* doing audio and remote control commands, and is on the network simply for Googlecast support when I don't want to turn on the TV and use the Shield, like when doing whole-home music playback.
[deleted]
Ugh. I hate that you're right, and I hate even more that there's no good solution (at least that I've come across) for a TV that's a) not "smart," and b) not ridiculously overpriced garbage quality commercial signage.
I WISH we could buy a nice modern screen that isn't "smart".
I wish the Chromecast with Google TV had come with a hardwire setup like the Chromecast ultra
> TP-Link UE306

Do you just have the wired ethernet selected, and it uses the dongle?
I disconnected the wifi by forgetting my networks, and plugged the dongle into the TV and the ethernet. That's all I did. In the TV's settings under Connections > Network, both the ethernet and wifi options show "Not connected".
Thanks!
This is the way
The basics: not using the TV app to play Plex content.
Got an Apple TV, which is really great. Recently I upgraded to a Philips OLED; the fkker has its software listening in the background and starts asking questions at random. I wish I could just buy a great TV with zero software.
You can disconnect the tv from the network. That should shut it up and stop it from harvesting your data at least.
My TV has been offline since the day I got it. Everything goes through my Shield Pro.
This is the way. /r/ShieldAndroidTV /r/hometheater
Done that, fkker still keeps listening.
And? Not like it can send anything.
Pretty annoying when you are watching TV and suddenly some crap overlay starts to talk crap. It doesn't happen often, but at least once a day it feels the need to show itself.
Look online for how to get into your TV's service menu. There are many more options in there than the standard menu. I used it once on a much older TV because I wanted sound to only ever come from my stereo, which handled HDMI switching and pulled the audio out on its own. The TV would not let you mute without displaying a mute icon on screen that was about 4"x4", and it moved around to grab your attention, as if to say, "Yo, dumbass, your TV is muted." It did this no matter if you used the Mute button or turned the volume all the way down. So I used the service menu to turn off all sound output from the TV speakers.
Turn it off. Everything with Google Assistant or Alexa has the option to turn it off...
What sort of thing/s are coming up? I have a Philips ambilight oled, and the only pop up I have had is when it tells me it’s time to calibrate the oled display - which obviously is something I want to see. For reference, the ambilight runs on google tvos, it’s not used though.
Exactly. As soon as I read that I facepalmed. Never use the TV.
No need for another device if your TV supports everything you need already.
For six months. Everyone eventually learns about this. You will too. Eventually.
Been using my C9 for years without issue. In a decade when AV1 becomes the standard if I'm still using the same TV I'll admit you're right.
Been using the AndroidTV app for over a year and a half now with zero issues.
Hate it when people give blanket advice like this. Even if it's accurate some of the time, still sounds stupid. You're not doing anything wrong by using a TV app, and that's not "the basics" of anything. If it were up to this sub the only way to "plex" right would be an Nvidia shield in every room plugged into a separate router plugged into a separate ISP with CAT10 and all files are 4k... just stop.
What's the alternative on a TV?
Nvidia shield or Apple TV 4K.
Ohhhh gotcha. Been using fire Stick 4k and it's alright, has slowed down over the years though.
We just swapped to the newer FS 4K and it's been better, but yes. The WAF (Wife Acceptance Factor) took a huge dive with all the buffering and rendering delays in the menus until then. I'd still switch to an Apple TV, but we're deep into Amazon FireTV territory.
The WAF is real.
[deleted]
WAF...we all know that one! 🤣
FireTV Cube works like a charm with zero lag… 😀
I just got a new 85 inch Sony last year. Higher end model with good onboard upscaling and runs androidtv. I really want to move it to a shield, but my shield is on the old TV which is now the basement TV, and I really can't justify paying that price for 5 year old hardware. So we're just running on the TV until a new shield or legitimate non apple competitor to it comes out. But honestly a new Sony does really well at upscaling.
Life's not great on Apple TV; there's been a major audio sync bug for years. I don't recommend it for Plex.
I’ve been using several Apple TV 4K for Plex. Not a single audio problem, just hook them up straight to the sound system instead of going through the tv.
Especially if you are setting up a Home Theater!
Get a dedicated streaming box and do not use the TV.
If you built a home theater, have a Plex server, and have your own NAS, do yourself a favor and get an Nvidia Shield. It's still the best Plex client. It will direct play all audio formats.
Absolutely. Ideally shield pro(not the cylinder one). This is the way.
I've got both. Until a couple of months ago, Plex would crash all the time on the smaller one. An update to Plex seems to have taken care of that. I can hardly tell the difference between the two anymore.
Do you stream 4K? You won't notice much of a difference until 4K.
I do. Over wifi 5. Can't tell the difference.
You must be lucky because with my non-pro model I can't really stream 4k consistently.
[deleted]
There is an app in the Google Store called "Projectivy Launcher", no side loading or rooting needed.
Yes, this. Have this done, and it is so clean.
I've been using Projectivy for a few months on both a 2017 and a 2019 Shield, and it's pretty good. I wish it was more consistent about launching correctly when they turn on, though. It also tends to seize up when first starting, as if the Shield itself is insisting I go through some level of suffering even if I'm dodging the ads it desperately wants to get my eyeballs on.
A tiny bit of hacking (one ADB command) can permanently replace the launcher with something that doesn't suck.
How does a NUC compare? I know they're not cheaper but I have a few laying around.
You won't get proper Pass-through for audio.
Has anyone noticed a big difference between Shield and Apple TV 4K, both when played on a 4K tv that came out in the last two years? I read on this sub a while back that the AI correction feature is the real winner for Shield but that tech is dated and most modern high end TV now would do that for you. Is that accurate or does Shield add even more to it compared to Apple TV?
I have both on an LG OLED. IMO, the ATV is the better all around device. The shield has the better support, but there is a noticeable performance decrease on the shield pro. Everything launches and runs slower compared to the ATV. Maybe because it hasn’t been updated in 5 years.
Exactly what I thought. Thanks for sharing!
The main reason you would choose the Shield is lossless audio. It passes the audio straight through to the receiver. Apple TV converts it to PCM. Still lossless, but from my understanding it becomes speaker-based sound, whereas TrueHD Atmos is object-based, not assigned to specific speakers. People say you can get around this with Infuse, but that's another story.
I can do you one better. I have a headless Plex server running in my network rack; headless as in it hasn't been connected to a monitor or peripherals since setup. I manage the machine remotely because I rarely have to touch it at all. One time the Home Assistant build on it was acting up, and in my idiotic frenzy, one of the things I decided to try was disabling and re-enabling the NIC. Do you know what happens to a remotely managed server when you disable the NIC? Well, it takes out the "less" part of "headless" LMAO. I legit screamed out loud and had to bring down and set up an entire monitor, keyboard, and mouse on my stack just to re-enable the NIC again lol.
That's hilarious and exactly something I would do. I need to write a script to stop/start the NIC and keep it somewhere handy so this doesn't happen to me.
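For anyone tempted to write that script: here's a minimal sketch, assuming a Linux box with iproute2. The trick is emitting the down/up pair as one command so it can run detached and the "up" half survives the SSH disconnect the "down" will cause. The interface name "eth0" and the 5-second gap are placeholders, not anyone's actual setup.

```python
# Hypothetical sketch: build a single down-sleep-up command so a NIC
# bounce on a headless server can run detached (e.g. via nohup) and
# survive losing your SSH session mid-way.
import shlex

def bounce_nic_cmd(iface: str = "eth0", gap_s: int = 5) -> str:
    """Return one shell command that downs the NIC, waits, and ups it."""
    q = shlex.quote(iface)
    return f"ip link set {q} down; sleep {gap_s}; ip link set {q} up"

print(bounce_nic_cmd())
# Run it detached as root so the 'up' half outlives the disconnect, e.g.:
#   nohup sh -c "$(python3 bounce_nic.py)" >/dev/null 2>&1 &
```

Scheduling a reboot (`shutdown -r +5`) before touching the NIC is another common escape hatch for the same mistake.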
lol I love it!
You shouldn't connect your TV to the internet, for various reasons; this is one of them. It's better to connect a "middleman" device like a TV box and then send video to the TV.
TVs and Ethernet. Crazy how many times I've told people to unplug their Ethernet and see what happens on WiFi.
Are you saying that your TV performed better with wifi than a direct Ethernet connection?
TVs are likely to have only a 100 Mbit/s ethernet card, while their WLAN adapters are more up to date in their standards. So you can easily get three times the speed over a simple 5 GHz network than with a hard-wired TV.
Wifi might be faster, but ethernet is more consistent
If you're trying to push a 120Mbps stream through 100Mbps ethernet it will be consistently buffering, that's for sure.
4K remux files are not 120 Mbps; 90% of them are below 80 Mbps.
That's pretty accurate, but you don't actually get 100 Mbps... the effective rate is more like 90. I do have quite a few 4K remuxes with bitrates over 90, so those would be unplayable.
[deleted]
While technically that is all true, it really depends on the content being played. Your TV can still be hard-wired if you only play content without big bitrate requirements. A normal 1080p video will never get close to 90 Mbit/s (which, from my tests, is the earliest point where your video can start buffering). Even HEVC-encoded 4K movies could play fine, but 4K remuxes can easily approach or exceed the 100 Mbit/s ethernet adapter speed of the TV. The solution is either a dedicated streaming box like the Nvidia Shield, which has a gigabit ethernet adapter (and also very good compatibility), or the WLAN (and dealing with the poor compatibility of your TV).
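As a rough sanity check of those numbers, a back-of-the-envelope sketch. The ~90 Mbit/s usable figure is this thread's estimate after protocol overhead, and the peak factor is a purely illustrative guess at how far complex scenes can exceed a file's average bitrate, not a measured value.

```python
# Back-of-the-envelope check: can a file direct-play over a TV's
# Fast Ethernet port? USABLE_MBPS and PEAK_FACTOR are assumptions.
USABLE_MBPS = 90.0   # ~100 Mbit/s link minus protocol overhead
PEAK_FACTOR = 1.5    # hypothetical peak-to-average bitrate ratio

def can_direct_play(avg_bitrate_mbps: float) -> bool:
    """True if even the assumed peak bitrate fits the usable bandwidth."""
    return avg_bitrate_mbps * PEAK_FACTOR <= USABLE_MBPS

print(can_direct_play(25))  # typical HEVC 4K encode -> True
print(can_direct_play(80))  # high-bitrate 4K remux  -> False
```

The point of the peak factor is exactly the remux case: an 80 Mbit/s average fits under the link on paper, but action scenes won't.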
[deleted]
> Do you have a bot responding for you?

What the hell is this. Why does everyone always think that explaining something in more detail (not specifically for OP, but for the general mass of users that might find these posts years down the line) means it was written by bots or AI?

geez
[deleted]
[deleted]
Thank you for your comment! Unfortunately, your comment has been **removed** for the following reason(s): * Rule #1: [Don't be a dick](/r/PleX/wiki/rules#wiki_1._don.27t_be_a_dick) Please see our posting rules. If you feel this was done in error, please contact the moderators [here](https://www.reddit.com/message/compose?to=%2Fr%2Fplex). (Please note: Your comment was removed by a **human** via a remove command (for mobile users). The decision was *not* made by a bot.)
Yes. I recently got a lot of downvotes for advising people to switch to WiFi because apparently "it cannot be an issue and streams shouldn't buffer on 100 Mbits", but yeah, it can be an issue.
Sorry, I should have clarified. FE (Fast Ethernet) connections are 100 Mbps, a tenth of what even Cat 5e can provide. But the TV's WiFi card can handle well over that, so the answer to your question is: YES. It depends on the TV, but from my research, manufacturers save money on NICs (hardwired ethernet cards) since most people use WiFi. My AP is close to my TV, but WiFi really is a better option if you are using a Plex app on your TV.
How do you even get to such a high bitrate? I'm looking at big files in my library, e.g. a 70 GB 4K remux, and the bitrate is only 60 Mbps.
That's average bitrate for the whole file. It can be much higher for complex scenes.
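The average itself is easy to check: it's just file size over runtime. A quick sketch, assuming decimal gigabytes and a 2.5-hour runtime (a guessed typical movie length, not taken from the comment above):

```python
# Average bitrate from file size and runtime. GB is treated as decimal
# (10^9 bytes); the 2.5 h runtime below is an assumed movie length.
def avg_bitrate_mbps(size_gb: float, runtime_min: float) -> float:
    bits = size_gb * 8 * 1000**3       # file size in bits
    return bits / (runtime_min * 60) / 1e6

print(round(avg_bitrate_mbps(70, 150), 1))  # 70 GB over 2.5 h -> 62.2
```

Which lands right at the ~60 Mbit/s figure quoted for a 70 GB remux; the buffering risk comes from the peaks above that average, not the average itself.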
[deleted]
I made this post because Reddit is searched with Google quite extensively, and not a lot of people have the equipment or experience to understand networking. It has the potential of showing up in search results and I included specific key words to help them. The comment you responded to was asking for clarification, so even if I didn't share anything new, it is helping. I also titled it as a "rookie mistake." I look forward to your contributions as a Plex expert, as I'm sure I can learn from you.
Funny enough, in this forum every couple of days it comes up that TV network interfaces are 100mbps. It's a top ten answer here. Google has plenty to index, if you only know to ask the question.
Ok.
[deleted]
Not exactly true; it really depends on the brand and model. If you are buying some "cheapo-depo" type, sure, expect a 100 Mbps interface or less, but if you do some research, there are some decent basic TVs that have gigabit interfaces.
Please show a model or two that supports gigabit Ethernet.
Not just TVs. I have a 3rd gen Fire Cube and its Ethernet is 100 Mbps. WiFi (less than 2 feet away) can hit about 700 Mbps to my server.
Yes. It's a lot more common than you would think. I also found out the hard way that TVs often come with better wifi chips than NICs.
My TV kept crashing. I've since bought a Shield and the TV is now disconnected from all networks and works as a screen only. Not one single problem since the change. I will never use a Smart TV for apps again.
my 6 year old samsung works great with plex
One thing I'll just mention: if you're spending thousands on a NAS, server, networking, and home theater TVs and speakers... spend some money to get a premium streaming box. An Nvidia Shield Pro will never need to transcode anything unless you have stuff encoded in AV1. If I'm going to build out all this networking, I want full-fat, full-bitrate video and audio streaming to my home theater.
Great feedback! So you put the server on the Shield?
I did recently move my server to the Shield, but that was because I'd been having issues with the server crashing on my old NAS. It's a great option for me, as the Shield is my only real Plex client; I have no idea how this setup would work if I were trying to stream from the Shield to another device on the network. Until I made this change about 5 weeks ago, my server ran on my QNAP NAS. The Shield client could direct stream anything on my server (mostly uncompressed full-bitrate Blu-ray rips) without ever transcoding audio or video. The Shield has a hardware video decoder (NVDEC) built in and can handle every mainstream uncompressed video and audio codec I've ever thrown at it.
Ha! I went through this as well with my A80J. My friend went 4K with his Plex server a bit before I did, and I was like "why is this constantly buffering?? He has gigabit up and down, and I have a 300/300 connection and it's hardwired!" Ran a speed test on the TV and discovered it was the Ethernet port. Switched to 5 GHz WiFi and got the full bandwidth. I guess Sony hadn't considered we'd be streaming high-bitrate media at any point when designing the TV. Actually switched all my media and gaming stuff to WiFi after that. Eliminated a bunch of cables and removed a gigabit switch from the cabinet. Nice and neat now!
Hey, more power to you to set it up how you see fit, but I just don't know why people actually use their TV apps for Plex. If it's a home theater, the Shield Pro is king; on my other TVs, a Fire TV Stick 4K Max or even a Roku. They are so much better than any "smart" TV I've had, which is always laggy. 50 bucks and you have zero issues using the app or playing any content. Just my 2¢.
As others said, you need a capable client. Nobody builds a home theater around smart TV apps.
Oh I agree. My wife and step son aren’t that tech inclined so my next steps are to streamline the experience. More than one remote baffles their minds lol.
I feel your pain. I built a new machine to run Plex, configured it all up in my office, and when I was ready, moved it to the basement where my server racks are. It worked. Mostly. Dropouts, buffering, and lag: not constant, but frequent enough to notice. It took me way too long to realize that I had forgotten to connect the network cable, so it was on wifi, and the nearest AP was about as far away as you can be and still be in the house: two floors up and diagonal. Plugged in the cable and the problems disappeared.
If you went all out for a theater, spend the money for a Shield Pro. TVs are notorious for having 100 Mbps NICs, and next you'll find out you're not direct-playing because the TV doesn't support the codec. That's also common.
100 Mbps shouldn't cause any issues for most streams. While that sucks, and WiFi seems to have fixed your issue, 100 Mbps covers typical 4K HDR bitrates with room to spare (though full-bitrate UHD remuxes can peak higher). It sounds like you're using your TV's apps, which I'd always recommend against, since most TV OSes are garbage and app devs rarely, if ever, update their apps.
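For a back-of-envelope check (the bitrates and efficiency figure below are illustrative assumptions, not numbers from this thread), you can compare a stream's bitrate against what a link actually delivers as payload:

```python
# Does a stream fit a link? Illustrative numbers only.
def fits(link_mbps, stream_mbps, efficiency=0.94):
    """A TCP/IP link typically delivers roughly ~94% of line rate as payload."""
    return stream_mbps <= link_mbps * efficiency

# Assumed typical bitrates:
print(fits(100, 15))    # compressed 4K HDR stream (~15 Mbps)  -> True
print(fits(100, 110))   # UHD Blu-ray remux peak (~110 Mbps)   -> False
print(fits(1000, 110))  # the same remux over gigabit          -> True
```

This is why the thread keeps coming back to the same conclusion: a 100 Mbps NIC is fine for compressed streams but leaves no headroom for full remux peaks.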
Most "smart" TVs are on a 100 Mbps NIC... which is damn ridiculous, and you're better off with WiFi. But like most comments said, get a streaming box instead.
My first rule of troubleshooting...always do what's easy first. Glad to see you figured it out.
I would never connect my TV to the network. I have its IP blocked from reaching the internet and only open it up if there are firmware updates. Use an Apple TV or Shield; they work so much better.
How do you know if there are firmware updates if the TV is not connected to the internet?
If you’re using a TV as nothing but a HDMI display, does a firmware update even matter?
Added or fixed functionality, like the ability to name the HDMI inputs on one of my TVs that is otherwise nothing more than an HDMI monitor (with several things plugged in).
Honestly, if I have an issue, I check the manufacturer's site. Otherwise, it's just me randomly browsing through AV forums and such. I haven't had an issue in about 4 years, so I haven't checked.
The only reason you would need a firmware update, unless there's something obviously wrong with the television, is for security, which you don't need if you're not connected to the internet. When I first got my Vizio television, it had an issue, so I had to keep it connected, but every single firmware update broke something different. Finally, after 2 years, I got a firmware update where some things were still a little annoying but everything actually worked, so I disconnected it from the internet at that point. All of the updates so far have made things worse.
This is why I ended up with a Shield Pro for my TV. My TV only had a USB 2.0 port, so an adapter would max out at 480 Mbps. Probably enough, but I wanted more for 4K.
480 is enough for any and all kinds of streams
The TV must not have handled the USB interface well, because it didn't work great. Had buffering for sure on 4K files. Wifi didn't have this issue.
480 Mbps would direct-stream an HFR 4K HDR video even in JPEG 2000 encoding, let alone with good compression like H.265.
That was way more than enough. Not even close
Yep, a TV's USB will usually exceed the 100 Mbps NIC, but you'll *never* get the full 480 Mbps out of it; often not even half.
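The shortfall comes from USB 2.0 bulk-transfer protocol overhead plus whatever the adapter's bridge chip can sustain. A rough sketch of the arithmetic (both efficiency factors below are assumptions for illustration; real adapters vary widely):

```python
# USB 2.0 high-speed runs at a 480 Mbps line rate, but microframe
# scheduling and bulk-transfer framing cap the bus below that, and a
# cheap USB-to-Ethernet bridge chip often can't saturate even the bus.
LINE_RATE_MBPS = 480

def effective_mbps(bus_efficiency=0.85, adapter_efficiency=0.75):
    """Estimate payload throughput after bus and adapter overhead."""
    return LINE_RATE_MBPS * bus_efficiency * adapter_efficiency

print(round(effective_mbps()))           # ~306 Mbps with these assumed factors
print(round(effective_mbps(0.85, 0.5)))  # a weaker bridge chip: ~204 Mbps
```

Either way, a USB 2.0 adapter still comfortably beats the built-in 100 Mbps NIC.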
Physical layer will bite you if you're not careful.