HardwareBanter: a computer components & hardware forum
GeForce6200 configuration



 
 
#1 | April 28th 06, 12:41 AM | posted to alt.comp.periphs.videocards.nvidia
GeForce6200 configuration

There's really not a problem here, but I installed the latest NVIDIA driver
(8756), which went just fine. I ran nvidia-xconfig; that worked too. Jumped
into X: it runs well, looks good, seems snappy. But when I run glxgears I get
only about 180 fps. What does it take to get that up to around the 2700 fps
others report?
My system is a GeForce 6200 LE PCI Express card with 256 MB,
x86 P4 3.4 GHz, 2 GB DDR2-533 RAM, SATA HDD, Intel 945 chipset.
Screen resolution: 1280x1024, 16-bit color.
I'm running Linux (Mandriva 2006); system load is nil.
Any thoughts?
Thanks,
Eric
PS: I know glxgears isn't a good benchmark, but its numbers ought to be much
higher than what I'm getting.

Below is some data copied from xorg.conf, glxinfo, and Xorg.0.log.

Here's part of my xorg.conf
Section "Module"
Load "dbe" # Double-Buffering Extension
Load "v4l" # Video for Linux
Load "extmod"
Load "type1"
Load "freetype"
Load "/usr/X11R6/lib/modules/extensions/libglx.so"
Load "glx"
EndSection
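
One thing stands out in that Module section: GLX is loaded twice, once by
absolute path and once by module name. That is redundant at best, and if a
Mesa libglx is also installed it can pull in the wrong copy. A cleaner sketch,
assuming NVIDIA's libglx.so is the one in the server's module path (which is
what the NVIDIA installer normally arranges):

Section "Module"
Load "dbe" # Double-Buffering Extension
Load "v4l" # Video for Linux
Load "extmod"
Load "type1"
Load "freetype"
Load "glx" # NVIDIA's GLX module; no need to also Load it by full path
EndSection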

And a snippet from the output of glxinfo:

name of display: :0.0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
GLX_ARB_multisample, GLX_NV_float_buffer, GLX_ARB_fbconfig_float
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info,
GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync,
GLX_NV_swap_group, GLX_NV_video_out, GLX_SGIX_fbconfig,
GLX_SGIX_pbuffer,
GLX_SGI_swap_control, GLX_NV_float_buffer, GLX_ARB_fbconfig_float
GLX extensions:
GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
GLX_ARB_multisample, GLX_NV_float_buffer, GLX_ARB_fbconfig_float,
GLX_ARB_get_proc_address
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce 6200 LE/PCI/SSE2
OpenGL version string: 2.0.2 NVIDIA 87.56
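
That output looks right: NVIDIA on both the client and server side, with
direct rendering enabled. If there were any doubt that the NVIDIA libGL is the
one actually being linked in (rather than a leftover Mesa copy), a quick check
would be something like:

ldd /usr/X11R6/bin/glxgears | grep libGL
ls -l /usr/lib/libGL.so.1

Both should resolve to the NVIDIA-supplied libraries.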

Also, Xorg.0.log looks like it has no errors. Here's a snippet:
(II) resource ranges after xf86ClaimFixedResources() call:
[snip ranges and crap]
then:

(II) Setting vga for screen 0.
(**) NVIDIA(0): Depth 16, (--) framebuffer bpp 16
(==) NVIDIA(0): RGB weight 565
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): NVIDIA GPU GeForce 6200 LE at PCI:1:0:0
(--) NVIDIA(0): VideoRAM: 131072 kBytes
(--) NVIDIA(0): VideoBIOS: 05.44.02.45.00
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 6200 LE at PCI:1:0:0:
(--) NVIDIA(0): HP D8912 (CRT-0)
(--) NVIDIA(0): HP D8912 (CRT-0): 400 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Device: CRT-0
(WW) NVIDIA(0):
(WW) NVIDIA(0): No modes were requested; the default mode
"nvidia-auto-select"
(WW) NVIDIA(0): will be used as the requested mode.
(WW) NVIDIA(0):
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "nvidia-auto-select"
(**) NVIDIA(0): Virtual screen size configured to be 1280 x 1024
(--) NVIDIA(0): DPI set to (90, 96); computed from "UseEdidDpi" X config
option
(II) do I need RAC? No, I don't.
(II) resource ranges after preInit:

[snip more resource ranges]
then:
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Loading extension NV-GLX
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled
(**) Option "dpms"
(**) NVIDIA(0): DPMS enabled
(II) Loading extension NV-CONTROL
(==) RandR enabled
(II) Initializing built-in extension MIT-SHM
(II) Initializing built-in extension XInputExtension
(II) Initializing built-in extension XTEST
(II) Initializing built-in extension XKEYBOARD
(II) Initializing built-in extension LBX
(II) Initializing built-in extension XC-APPGROUP
(II) Initializing built-in extension SECURITY
(II) Initializing built-in extension XINERAMA
(II) Initializing built-in extension XFIXES
(II) Initializing built-in extension XFree86-Bigfont
(II) Initializing built-in extension RENDER
(II) Initializing built-in extension RANDR
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing built-in extension XEVIE
(II) Initializing extension GLX
(**) Option "CoreKeyboard"
(**) Keyboard1: Core Keyboard
(**) Option "Protocol" "standard"
(**) Keyboard1: Protocol: standard
(**) Option "AutoRepeat" "500 30"
(**) Option "XkbRules" "xorg"
(**) Keyboard1: XkbRules: "xorg"
(**) Keyboard1: XkbModel: "pc105"
(**) Option "XkbLayout" "us"
(**) Keyboard1: XkbLayout: "us"
(**) Option "XkbOptions" "compose:rwin"
(**) Keyboard1: XkbOptions: "compose:rwin"
(**) Option "CustomKeycodes" "off"
(**) Keyboard1: CustomKeycodes disabled
(**) Option "Protocol" "ExplorerPS/2"
(**) Mouse1: Device: "/dev/mouse"
(**) Mouse1: Protocol: "ExplorerPS/2"
(**) Option "CorePointer"
(**) Mouse1: Core Pointer
(**) Option "Device" "/dev/mouse"
(==) Mouse1: Emulate3Buttons, Emulate3Timeout: 50
(**) Option "ZAxisMapping" "6 7"
(**) Mouse1: ZAxisMapping: buttons 6 and 7
(**) Mouse1: Buttons: 11
(II) XINPUT: Adding extended input device "Mouse1" (type: MOUSE)
(II) XINPUT: Adding extended input device "Keyboard1" (type: KEYBOARD)
(II) XINPUT: Adding extended input device "NVIDIA Event Handler" (type:
Other)
(II) Mouse1: ps2EnableDataReporting: succeeded
SetClientVersion: 0 9
(II) Open ACPI successful (/var/run/acpid.socket)
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Mouse1: ps2EnableDataReporting: succeeded
(II) Open ACPI successful (/var/run/acpid.socket)
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Mouse1: ps2EnableDataReporting: succeeded
SetClientVersion: 0 9
SetGrabKeysState - disabled
SetGrabKeysState - enabled
(II) Open ACPI successful (/var/run/acpid.socket)
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Mouse1: ps2EnableDataReporting: succeeded
(II) Open ACPI successful (/var/run/acpid.socket)
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Mouse1: ps2EnableDataReporting: succeeded
(II) 3rd Button detected: disabling emulate3Button



#2 | April 29th 06, 10:48 PM | posted to alt.comp.periphs.videocards.nvidia
GeForce6200 configuration

Eric wrote:
There's really not a problem here, but I installed the latest NVIDIA driver
(8756), which went just fine... when I run glxgears I get only about 180 fps.
What does it take to get that up to around the 2700 fps others report?
[rest of the quoted post, config, and logs snipped; see post #1 above]


The problem may be that you are using a 16-bit visual (just a guess). Try
raising your default screen depth to 24 (32-bit color). When you installed the
driver, there should also have been a sample xorg.conf file. Also, take a look
at

http://download.nvidia.com/XFree86/L...ppendix-c.html

to see that all the correct libraries are really in place and being
accessed correctly (the ldd section at the end of the page). I have
never had a problem with SuSE or Red Hat (including Fedora). At the
default window size, glxgears should be running thousands of frames per
second (e.g., ~7000 with a 5700-based card).
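
In xorg.conf terms that means raising DefaultDepth; note that the X server
expresses "32-bit" color as depth 24 (8 bits each of R, G, and B in a 32 bpp
framebuffer). A minimal sketch of the relevant part of the Screen section;
the Identifier/Device/Monitor names here are placeholders, so keep whatever
names are already in your file:

Section "Screen"
Identifier "Screen0" # placeholder; use your existing identifier
Device "Videocard0" # placeholder
Monitor "Monitor0" # placeholder
DefaultDepth 24
SubSection "Display"
Depth 24
Modes "1280x1024"
EndSubSection
EndSection

After restarting X, glxinfo should report depth-24 visuals and glxgears can be
re-run for comparison.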
#3 | May 4th 06, 09:59 AM | posted to alt.comp.periphs.videocards.nvidia
GeForce6200 configuration

[snip]
The problem may be that you are using a 16-bit visual (just a guess). Try
raising your default screen depth to 24. [...] Also, take a look at
http://download.nvidia.com/XFree86/L...ppendix-c.html
to see that all the correct libraries are really in place and being accessed
correctly (the ldd section at the end of the page).
[rest of the quoted reply snipped]


Well, I'm now on a new mobo; I went from the 915 chipset to the 945. I did a
complete reformat and install of Mandriva 2006.0 from the PowerPack 7-CD set,
did all updates, removed the 7676 NVIDIA driver, installed the latest 8756
driver, and went through the entire web page you mentioned, and I still only
get 195 fps from glxgears. The only thing I found that I question is that ldd
reports
libm.so.6 => /lib/tls/libm.so.6 (0xb7da8000)
libc.so.6 => /lib/tls/libc.so.6 (0xb7c7a000)
but libm.so.6 is actually libm-2.3.5.so on my system, and all instances of
libm.so.6 are just symlinks pointing to libm-2.3.5.so (same for libc),
i.e.: /lib/tls/libm.so.6 -> libm-2.3.5.so
/lib/tls/libc.so.6 -> libc-2.3.5.so
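
That symlink layout is normal: the soname libm.so.6 is supposed to be a link
to the real versioned library, and ldd reports the path it resolved rather
than the final target. To see the full chain explicitly, something like:

ls -l /lib/tls/libm.so.6
readlink -f /lib/tls/libm.so.6

readlink -f prints the fully resolved target (here, /lib/tls/libm-2.3.5.so),
confirming nothing is broken.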

Also, the ldd address values are different, except for linux-gate:
# ldd /usr/X11R6/bin/glxgears
linux-gate.so.1 => (0xffffe000)
libGL.so.1 => /usr/lib/libGL.so.1 (0xb7e65000)
libXp.so.6 => /usr/X11R6/lib/libXp.so.6 (0xb7e5c000)
libXext.so.6 => /usr/X11R6/lib/libXext.so.6 (0xb7e4e000)
libX11.so.6 => /usr/X11R6/lib/libX11.so.6 (0xb7d82000)
libpthread.so.0 => /lib/tls/libpthread.so.0 (0xb7d70000)
libm.so.6 => /lib/tls/libm.so.6 (0xb7d4b000)
libc.so.6 => /lib/tls/libc.so.6 (0xb7c1d000)
libGLcore.so.1 => /usr/lib/libGLcore.so.1 (0xb745b000)
libnvidia-tls.so.1 => /usr/lib/tls/libnvidia-tls.so.1 (0xb7459000)
libdl.so.2 => /lib/libdl.so.2 (0xb7455000)
/lib/ld-linux.so.2 (0xb7f03000)
The 0xb7... addresses don't match the ldd example at the URL you posted. That
shouldn't matter: they are just the load addresses the dynamic loader happened
to pick, and only the resolved library paths are significant. I traced every
symlink in the article and they all match the article's settings.
One anomaly:
/lib/modules/`uname -r`/kernel/drivers/video/nvidia.o
is actually a .ko file on my system, not a .o,
i.e.:
/lib/modules/2.6.12-12mdksmp/kernel/drivers/video/nvidia.ko
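
That one is expected rather than an anomaly: .ko is the kernel-module suffix
used by 2.6 kernels (2.4 kernels used .o), so the article simply predates the
change. A quick way to confirm the module is present and loaded, assuming the
standard module tools:

/sbin/lsmod | grep nvidia
/sbin/modinfo nvidia

lsmod should show an nvidia entry once X has started with the nvidia driver.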
I also checked Xorg.0.log: there are no (EE)s, no (WW)s, and it looks like
everything is loading. I can post it if that would help.
I am stumped! Maybe it's the card itself? It's a GeForce 6200 LE rev 161 with
128 MB on the card, using another 128 MB of main memory (which sounds like a
TurboCache part).
Thanks,
Eric

 



