Is 60Hz Better Than 30Hz?

jamoody
Junior
Posts: 85
Joined: Mon Aug 18, 2014 4:19 pm
United States of America

Is 60Hz Better Than 30Hz?

Post by jamoody »

I bought a SUHD (4K) TV (Samsung UN49KS8000FXZA) a few months back and have been unhappy with the picture quality during playback. I've been working with both the TV and MythTV settings and have gotten it better, but only up to fair, and certainly not as good as I think it ought to be. Figures (people, etc.) are fuzzy around the edges rather than cleanly defined, and playback is a bit jerky (especially noticeable on news scrolls). I'm using VDPAU High Quality settings for 1080i and OpenGL High Quality settings for 720p. I've recently discovered that I'm connected at 30Hz and thus can't use the 2x deinterlacers.

My current video card is a GT 730 (EVGA 02G-P3-3733-KR), which apparently has HDMI 1.4 and so tops out at 30Hz at this resolution. I'm thinking of going to a GTX 1050 (EVGA 02G-P4-6152-KR), which has HDMI 2.0 and can do 60Hz. Is this upgrade worthwhile or overkill for 1080i/720p? It seems like the 30Hz connection could very well explain the jerky scrolling, but I'm less confident it explains the fuzzy figure edges. I don't play games at all, nor do I use Blu-ray or DVD players. This frontend is strictly dedicated to MythTV 1080i/720p broadcast playback.

I'm running MythTV 0.28-76 on Ubuntu 14.04.5 with a Core 2 Duo E8400 running at 3.0GHz. During OpenGL playback the processor sits at about 30%, so I'm not CPU bound.
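For reference, here's the kind of quick shell check that confirmed the 30Hz connection for me. The sample line is just an illustration, and the parsing assumes xrandr's usual convention of marking the active refresh rate with an asterisk:

```shell
#!/bin/sh
# Extract the active refresh rate (the value xrandr marks with '*')
# from `xrandr --query` output.
active_rate() {
    grep -oE '[0-9]+\.[0-9]+\*' | head -n1 | tr -d '*'
}

# On a live system you'd run:
#   xrandr --query | active_rate
# Illustrative sample line for a 4K output stuck at 30Hz:
printf '   3840x2160     30.00*+  25.00    24.00\n' | active_rate
# prints 30.00
```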
daraden
Senior
Posts: 175
Joined: Tue Feb 23, 2016 7:33 am
United States of America

Re: Is 60Hz Better Than 30Hz?

Post by daraden »

60Hz will help smooth out fast motion and 60fps content. It may help with some of the noise around the edges, and being able to use the higher-quality 2x deinterlacers should also help.
Is it more noticeable on 1080i or 720p?
Why VDPAU on 1080i and OpenGL on 720p?
Have you tried changing your display settings to 1920x1080@60Hz to force the TV to do the scaling?

The GTX 1050 seems like overkill for just video playback, but it and the RX 460 are the cheapest options at the moment to get HDMI 2.0. Long term it might not be so bad to have the HEVC decoder and enough power to handle 4K playback, since your CPU likely won't handle that very well. With the 1050 you get HDMI 2.0b, so you also get HDR support. You may not find any of that useful right now, but with ATSC 3.0 not too far off it could save money later.
You could also try to find a cheaper card with DisplayPort 1.2 and use a DisplayPort to HDMI 2.0 adapter, though you would lose CEC that way.
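If you want to try the 1080p@60 idea, something like this xrandr call should do it. Shown as a dry run, and HDMI-1 is just an example output name; check `xrandr --query` for your actual output, then drop the echo to apply it:

```shell
# Dry run: print the command that would force 1080p at 60Hz on the HDMI
# output, letting the TV handle the upscale to 4K. HDMI-1 is an example
# name; substitute the connected output reported by `xrandr --query`.
OUTPUT=HDMI-1
echo xrandr --output "$OUTPUT" --mode 1920x1080 --rate 60
```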