
GLX errors with Intel graphics + an nVIDIA card used for compute


I've been having a problem across multiple distributions (Devuan ASCII, Mint 18.3, Ubuntu 16.04) and over several years, and it comes down to my atypical hardware setup: I drive my monitor from the on-board graphics (an Intel Z170-chipset board, i.e. the CPU's integrated GPU), but I also have an NVIDIA GPU that I use purely for compute - so the NVIDIA driver is installed and loaded.
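(For reference, there is nothing exotic in my X configuration; the intent is simply "display on the Intel iGPU only". A minimal sketch of that intent is below - the BusID value is a placeholder, not my actual one; the real value would come from lspci.)

# /etc/X11/xorg.conf.d/20-intel-display.conf -- illustrative sketch only
# Pin the display to the Intel iGPU so the NVIDIA card stays compute-only.
# BusID is a placeholder; get the real one from: lspci | grep -E 'VGA|3D'
# (e.g. "00:02.0 VGA compatible controller" -> PCI:0:2:0)
Section "Device"
    Identifier "IntelDisplay"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"
EndSection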



Now, my problems all stem from (or at least manifest through) some kind of OpenGL-driver mix-up. My X startup log says:



[    16.296] (II) AIGLX: Screen 0 is not DRI2 capable
[    16.296] (EE) AIGLX: reverting to software rendering
[    16.344] (EE) AIGLX error: dlopen of /usr/lib/x86_64-linux-gnu/dri/swrast_dri.so failed (/usr/lib/x86_64-linux-gnu/dri/swrast_dri.so: undefined symbol: _glapi_tls_Dispatch)
[    16.344] (EE) GLX: could not load software renderer
[    16.344] (II) GLX: no usable GL providers found for screen 0


and then a bunch of other programs and desktop environments fail. Specifically, Cinnamon crashes and starts in fallback mode. I filed a bug against the project, but they closed it, telling me:




Cinnamon requires working opengl/glx




which is fair enough, I suppose (although it still shouldn't crash outright). KDE also crashed the last time I checked, though to be fair that was a long time ago.
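For what it's worth, the undefined symbol suggests the X server loaded a GLX module that doesn't bring in Mesa's GL dispatch library (I believe _glapi_tls_Dispatch is normally exported by Mesa's libglapi). A quick diagnostic sketch, using the Debian-style multiarch paths from the log above:

# does the software rasterizer expect the symbol from its loader? ('U' = undefined)
nm -D /usr/lib/x86_64-linux-gnu/dri/swrast_dri.so | grep _glapi_tls_Dispatch
# is the symbol exported by Mesa's dispatch library? ('T' = defined)
nm -D /usr/lib/x86_64-linux-gnu/libglapi.so.0 | grep _glapi_tls_Dispatch
# which GLX module did the server actually load?
grep -i 'glx' /var/log/Xorg.0.log | head -n 20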



My question: what can I do to get X to bring up the GLX extension properly, with my hardware set up the way it is?



Following some advice on an Internet forum, I noticed I have the following files on my system:



/usr/lib/xorg/
/usr/lib/xorg/modules
/usr/lib/xorg/modules/extensions
/usr/lib/xorg/modules/extensions/libglxserver_nvidia.so
/usr/lib/xorg/modules/extensions/libglx.so
/usr/lib/xorg/modules/extensions/libglxserver_nvidia.so.410.73


so I moved them out of the way to see what would happen. Now I get this in Xorg.0.log:



[ 26359.170] (II) LoadModule: "glx"
[ 26359.170] (WW) Warning, couldn't open module glx
[ 26359.170] (II) UnloadModule: "glx"
[ 26359.170] (II) Unloading glx
[ 26359.170] (EE) Failed to load module "glx" (module does not exist, 0)


So: a "cleaner" error, but it doesn't solve the problem. It does let me ask a secondary question: where do I get an appropriate GLX module for my system, and how do I install it?
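In case it's useful to answerers, this is roughly how I'd trace where the stock module is supposed to come from on the Ubuntu/Mint systems (a dpkg-based sketch; xserver-xorg-core as the owner of libglx.so is my assumption for these releases):

# which package owns the stock GLX extension module?
dpkg -S /usr/lib/xorg/modules/extensions/libglx.so
# reinstall the owning package to restore a pristine copy
sudo apt-get install --reinstall xserver-xorg-core
# the NVIDIA variant is shipped by the driver package (410.73 here)
dpkg -S /usr/lib/xorg/modules/extensions/libglxserver_nvidia.so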

Tags: drivers, xorg, nvidia-graphics-card, intel-graphics, opengl

asked by einpoklum