Changing Video Mode causes Failure

For discussion of topics specific to MythTV on linux
kbocek
Senior
Posts: 201
Joined: Mon Jul 20, 2015 4:42 pm
United States of America

Re: Changing Video Mode causes Failure

Post by kbocek »

As I mentioned above, I mis-titled this thread. The video mode isn't changing. The option for "separate playback and menu modes" is *not* selected. I verified this at one point by playing some low-resolution videos and confirming on the TV that the output was 1080p/60.

I sure hope it's something in Myth. It was starting to feel like a hardware problem. I have two other frontends both with Intel video and they work great.

My monitor section in xorg.conf looks like:

Code: Select all

Section "Monitor"
        Identifier   "Samsung50"
        VendorName   "Samsung"
        ModelName    "Samsung50"
        Option   "DPMS" "False"
        Modeline "1920x1080-1"  148.50  1920 2008 2052 2200  1080 1084 1089 1125 +hsync +vsync
        Modeline "1920x1080i-1"   74.25  1920 2448 2492 2640  1080 1084 1094 1125 interlace +hsync +vsync
        Modeline "1920x1080i-2"   74.25  1920 2008 2052 2200  1080 1084 1094 1125 interlace +hsync +vsync
        Modeline "1920x1080-2"  148.50  1920 2448 2492 2640  1080 1084 1089 1125 +hsync +vsync
        Modeline "1920x1080-3"   74.25  1920 2558 2602 2750  1080 1084 1089 1125 +hsync +vsync
        Modeline "1920x1080-4"   74.25  1920 2448 2492 2640  1080 1084 1089 1125 +hsync +vsync
        Modeline "1920x1080-5"   74.25  1920 2008 2052 2200  1080 1084 1089 1125 +hsync +vsync
EndSection
The modelines were extracted from the X autoprobe output. On my other frontends, X adds the manual lines to the list of autoprobed modelines in Xorg.0.log. Somehow, here, my custom "-1", "-2" identifiers are not showing up, so I can't select them in my Screen section.

Not sure what to do now.
blm-ubunet
Senior
Posts: 265
Joined: Sun Jun 15, 2014 1:08 am
Cambodia

Re: Changing Video Mode causes Failure

Post by blm-ubunet »

I calculated the refresh rate for each of your seven listed modes:
60 p
25 i
30 i
50 p
24 p // a 23.976 variant should also be provided
25 p ?
30 p ?
All of the above are exact integer values (to 3 decimal places).
I can't comment on the validity of the sync polarities.
They appear to be a mix of standard computer modes rather than exact video modelines. Did they come out of a TV EDID?
Maybe some are rejected because they are parsed as duplicates, are plain wrong, and/or fall outside the acceptable vertical and horizontal sync bands.
I can't see any use for 25p and 30p; 24p (23.976) is bad enough.
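(These rates follow directly from frame rate = pixel clock / (htotal × vtotal); a quick check of the first progressive and first interlaced modelines above:)

```shell
# frame rate (Hz) = pixel_clock / (h_total * v_total)
# "1920x1080-1": 148.50 MHz clock, 2200 x 1125 total -> 60p
awk 'BEGIN { printf "%.3f\n", 148500000 / (2200 * 1125) }'
# "1920x1080i-1": 74.25 MHz clock, 2640 x 1125 total -> 25 frames/s (50 fields/s)
awk 'BEGIN { printf "%.3f\n", 74250000 / (2640 * 1125) }'
```

For interlaced modes this gives the frame rate; the field rate is twice that.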

You should never need to output interlaced video; make the video card do the deinterlacing. If it can't, replace it with something fit for purpose.
And you should always use reduced-vertical-blanking timings if possible.


So replace all of them with the output from these:
cvt -r 1920 1080 60 // 1920x1080 at 59.93 Hz, the right TV/video mode
&
cvt 1920 1080 50 // 50p, no RVB

cvt -i 1920 1080 30 // only if you really must have interlaced; not an exact refresh rate (modify the -r 60 modeline?)

Sadly, the FOSS modeline calculators do not understand reduced-blanking timing, the exact video timings for 24 and 50 Hz, or any interlaced mode.
But all modern displays work with reduced-blanking timing.
kbocek
Senior
Posts: 201
Joined: Mon Jul 20, 2015 4:42 pm
United States of America

Re: Changing Video Mode causes Failure

Post by kbocek »

So the calculated modeline is:

Code: Select all

Modeline "1920x1080R"  138.50  1920 1968 2000 2080  1080 1083 1088 1111 +hsync -vsync
I added this to my Monitor section and commented out the others. However, after rebooting, I do not see "1920x1080R" anywhere in Xorg.0.log. Any thoughts on why?

I wasn't expecting to use an interlaced mode. I just wanted to try everything and see if it would fix the problem.
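(For reference, a modeline can also be tried at runtime with xrandr rather than via xorg.conf; a minimal sketch, assuming the connector is named HDMI1 — plain `xrandr` lists the real name:)

```shell
# sanity-check the refresh rate of the cvt -r modeline:
# 138.50 MHz pixel clock, 2080 x 1111 total
awk 'BEGIN { printf "%.2f\n", 138500000 / (2080 * 1111) }'
# add and select the mode at runtime; only attempt inside a running X session
# (HDMI1 is an assumed connector name; run plain `xrandr` to find the real one)
if [ -n "$DISPLAY" ]; then
    xrandr --newmode "1920x1080R" 138.50 1920 1968 2000 2080 1080 1083 1088 1111 +hsync -vsync
    xrandr --addmode HDMI1 1920x1080R
    xrandr --output HDMI1 --mode 1920x1080R
fi
```

A mode added this way disappears at the next X restart, which makes it handy for testing before committing anything to xorg.conf.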
blm-ubunet
Senior
Posts: 265
Joined: Sun Jun 15, 2014 1:08 am
Cambodia

Re: Changing Video Mode causes Failure

Post by blm-ubunet »

I would guess that the whole file is being ignored due to a syntax error or missing semantics.
The log file should indicate that your xorg.conf is being opened.

The documentation on xorg.conf is very dated, except for Nvidia's, and that most likely does not apply here.
I use custom modelines but on Nvidia hardware.

Attached is an example Nvidia xorg.conf file (/etc/X11/xorg.conf.d/20-nvidia.conf):

Code: Select all

# 20-nvidia.conf
# nvidia-settings: X configuration file generated by nvidia-settings

Section "Modes"
    Identifier         "myHDTV"

    # 3840x2160 23.99 Hz (CVT) hsync: 52.59 kHz; pclk: 266.75 MHz
    Modeline "3840x2160_24.00"  266.75  3840 4056 4456 5072  2160 2163 2168 2192 -hsync +vsync

    # perfect 50Hz mode, reduced vertical blanking period
    # 3840x2160x50.000 @ 110.500kHz
    Modeline "3840x2160_50.00"  442.000  3840 3888 3920 4000  2160 2163 2167 2210  +HSync -VSync

    #3840x2160 p59.940  rvb
    Modeline "3840x2160_59.94"  532.75  3840 3888 3920 4000  2160 2163 2167 2222 +hsync -vsync

    Modeline "3840x2160_75.00"  904.00  3840 4168 4592 5344  2160 2163 2168 2257 -hsync +vsync
EndSection

Section "Monitor"

    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Philips PHL BDM4350"
    HorizSync       30.0 - 160.0
    VertRefresh     23.0 - 80.0
    DisplaySize     953   543
    Option         "DPMS"
    UseModes       "myHDTV"
    Option         "PreferredMode" "3840x2160_50.00"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 1050 Ti"
#    Option         "ForceFullCompositionPipeline" "DFP-2"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24

    Option         "Xinerama" "0"
    Option         "TwinView" "0"
    Option         "DefaultServerLayout" "Layout0"

    Option         "NoLogo"     "True"
    Option         "ModeDebug"  "True"
    Option         "SLI"        "Off"
    Option         "MultiGPU"   "Off"
    Option         "BaseMosaic" "Off"

    Option         "ConnectedMonitor"  "DFP-2"
    Option         "UseDisplayDevice"  "DFP-2"
    Option         "ConstantDPI"       "False"

    Option "UseEDID"        "False"
    Option "ModeValidation" "DFP-2: AllowNonEdidModes, NoEdidModes, NoVesaModes, NoXServerModes, NoPredefinedModes"

    Option         "ColorSpace"        "DFP-2: YCbCr444"
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "DFP-2"
    SubSection     "Display"
        Depth        24
        Modes        "3840x2160_50.00 ; 3840x2160_59.94 ; 3840x2160_75.00 ; 3840x2160_24.00"
        Option       "PreferredMode" "3840x2160_50.00"
    EndSubSection
EndSection
kbocek
Senior
Posts: 201
Joined: Mon Jul 20, 2015 4:42 pm
United States of America

Re: Changing Video Mode causes Failure

Post by kbocek »

No, it's being read. My monitor identifier "Samsung50" and the specified Intel driver settings are both present in Xorg.0.log; just not the custom modelines.
blm-ubunet
Senior
Posts: 265
Joined: Sun Jun 15, 2014 1:08 am
Cambodia

Re: Changing Video Mode causes Failure

Post by blm-ubunet »

Any clues here ?:
https://01.org/linuxgraphics/documentat ... -randr-1.2

Sadly, that site has devolved into that modern mobile glossy look with less content, just like the AMD website.

~$ man xorg.conf
kbocek
Senior
Posts: 201
Joined: Mon Jul 20, 2015 4:42 pm
United States of America

Re: Changing Video Mode causes Failure

Post by kbocek »

Yes, I played with xrandr. When the screen went blank, I found I could bring it "alive" again by switching to another resolution. Sadly, switching back to 1080p/60 *still* didn't work.