Welcome to the computing reference desk.
March 24
Browser extensions/add-ons/plug-ins
Are browser plug-ins/add-ons/extensions basically just like web pages too? It's JavaScript + XML (or another XML-like markup language) + graphical elements, right?--Scicurious (talk) 01:13, 24 March 2016 (UTC)
- The answer is "it depends". Really, you're going to have to look at each individual browser engine you're targeting for details. A source of confusion is that the terminology is not standardized. "Plug-in", "add-on", and "extension" are often used interchangeably in everyday speech, but they can mean different things "under the hood". To illustrate this, let's look at the Mozilla ecosystem. In Mozilla Land, "add-ons" is the generic term for anything that modifies the application. "Extensions" are a subset of "add-ons": they are written in JavaScript and markup languages, and they are the things you can download from addons.mozilla.org. "Plugins" are shared libraries that get loaded into the application, such as the Adobe Flash plugin. But confusingly, there are also "search engine plugins" that are just XML documents. --71.110.8.102 (talk) 03:52, 24 March 2016 (UTC)
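To make the "JavaScript plus markup" description concrete: in Mozilla's newer WebExtensions model, an extension is little more than a JSON manifest plus ordinary script files. A minimal sketch (the name, description, file name, and match pattern below are all made up for illustration):

```json
{
  "manifest_version": 2,
  "name": "Hello Extension",
  "version": "1.0",
  "description": "Injects a script into pages on example.com",
  "content_scripts": [
    {
      "matches": ["*://example.com/*"],
      "js": ["hello.js"]
    }
  ]
}
```

Here `hello.js` would be an ordinary JavaScript file that the browser loads into matching pages, which is part of why extensions feel so much like web pages.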
- You might find our article on Browser extensions of some help... Vespine (talk) 04:19, 24 March 2016 (UTC)
Is it possible to measure which OS is more stable?
Is it all an impressionistic measure? For example, some people have the impression that their Windows XP (of 10 years ago) was more stable than Ubuntu is nowadays. But given the fact that we are doing different things, using different programs, with different amounts of data, different drivers, and on different machines, can we measure stability at all? --Llaanngg (talk) 22:52, 24 March 2016 (UTC)
- In its broadest sense, to me, the stability of any system is its ability to resist or recover from any input disturbance. So, to test the relative stability of one system against others, one must hit each system with the same disturbances and see which system is least affected. The real problem, though, is probably choosing which disturbances to use in your testing. A system that can survive common disturbances may crash with one uncommon disturbance that it has not been designed to resist.--178.111.96.35 (talk) 01:28, 25 March 2016 (UTC)
- How about 16 years of uptime?[1] Or look at Voyager 1 and 2, which were launched in 1977 and are expected to remain operational until approximately 2025. And, of course, embedded systems that never crash are fairly common; when was the last time you saw a Casio solar watch crash? Or a 5ESS switch? Other than things like botched updates or the building burning to the ground, I don't think anyone has ever seen a 5ESS crash. I have designed embedded systems that do a hard reset 60 times a second, at which point they do the job I designed them to do and then go to sleep until the next reset. The entire concept of "crashing" doesn't apply to a system like that. --Guy Macon (talk) 05:49, 25 March 2016 (UTC)
- It is possible to measure just about anything if you can define how to measure it. You must define what "stable" means. In computer science, "stable" tends to refer to the stability of algorithms, and even then it can mean different things. If I say that a statistics algorithm is stable, I mean that it isn't overly influenced by outliers. If I say that a data-set algorithm is stable, I mean that it preserves the identity of each data item (e.g., if I sort a series of objects, I don't change the IDs of the objects that I am sorting). You appear to be referring to "stable" as in "continuing to function". That is an engineering definition of stability. However, I've personally never seen "stability" used in computer engineering; I assume that is because it would be confusing in a field so closely related to computer science. I have seen the common mean time between failures (MTBF) used: in actual operation, what is the MTBF? That, however, tends to refer to catastrophic failures. I think that you are referring to minor failures that cause a software failure, not a hardware failure, so MTBF would be the wrong measure.
Overall, I think you can stick with the common "uptime" measure. That is a measure of the percentage of time that an active server can be considered to be "up". It should never be 100%, because there has to be some period of time in which the operating system itself is upgraded. Even that time is shrinking fast, as operating systems gain the ability to update while running; they are technically down for only microseconds as they upgrade in place. The problem you will run into with uptime is that it takes the environment into account. What if the power in the server room fails? What if the network fails and the server is unreachable? Also, it doesn't take the software into account. What if I have a database server with 99% uptime, but the database program tends to crash a few times every day?
From my point of view, the server has about an 8-hour uptime before I have to restart something. Sorry for the long-winded non-answer, but your question doesn't make it possible to give an informed, correct answer, and I don't want to spout out opinions or anecdotes. 209.149.114.215 (talk) 15:22, 25 March 2016 (UTC)
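The measures mentioned above are easy to make concrete. A sketch of how uptime percentage and MTBF would fall out of an outage log (all timestamps below are made up):

```python
# Sketch: computing uptime percentage and MTBF from a (made-up) outage log.
# Each entry is (time_down, time_back_up), in hours since monitoring began.
outages = [(100.0, 100.5), (250.0, 251.0), (400.0, 400.25)]
observation_hours = 500.0

downtime = sum(up - down for down, up in outages)   # total hours spent down
uptime_pct = 100.0 * (observation_hours - downtime) / observation_hours

# MTBF: mean operating time between failures over the observation window.
operating_hours = observation_hours - downtime
mtbf = operating_hours / len(outages)

print(f"uptime = {uptime_pct:.2f}%")   # 99.65%
print(f"MTBF   = {mtbf:.1f} hours")    # 166.1 hours
```

As the comment above notes, this says nothing about *why* the server was down (power, network, or software), which is exactly the weakness of the measure.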
- I find the term "stable" to be inapt and subject to lots of abuse of terminology. Many people use the word "stable" to refer to software that is free of bugs. This seems incorrect: a better phrase would be "software that is free of bugs."
- In the dictionaries that I use, "stable" implies that there are no changes. This is an entirely orthogonal property of software from "software that is free of bugs."
- I work on a large number of software projects, including several free and non-free operating systems, that produce nightly builds. I'll frequently get an early morning phone call: is today's build "stable"?
- Well, how can it be? It changed since last night!
- Even worse: "is today's build buggy?" System software in this decade (2016) contains hundreds of billions of lines of program code. (For example, consider only the Linux kernel, which is altogether a very tiny portion of only one free operating system.) In any system with hundreds of billions of lines of program code, there are probably hundreds of millions of bugs. Many of those bugs are entirely irrelevant to what you need to do, and they will have no impact on you. Most human brains do not seem designed to comprehend the abject vastness of this complexity-space.
- I find that engineers and programmers can communicate much more productively when they evict the word "stable" from their vocabulary. Software should be described as either "free of relevant bugs," or if a bug exists, it ought to be immediately ticketed so the bug may be described precisely. If you aren't sure how to describe a problem, that's fine: you can still ticket a bug-report that says, for example, "Software A intermittently experienced Problem B while I was doing Task C." Software developers may not be able to fix Problem B yet, but your report does not exist in isolation - it can help identify the statistics of systemic, difficult-to-reproduce problems. When an issue is problematic, one should say "Bug #X prevents me from doing Task Y on Software Version Z."
- Such precise descriptions are much more useful than vague statements about which software is "stable" or "buggy."
- Here's a fantastic essay by Simon Tatham, author of PuTTY: How to Report Bugs Effectively.
- Nimur (talk) 16:35, 25 March 2016 (UTC)
- Regarding "hundreds of billions of lines of program code": Tiny Core Linux has a total size for the entire OS of 11 MB (16 MB if you want a GUI), and appears to be very crash-resistant. --Guy Macon (talk) 16:51, 25 March 2016 (UTC)
- You're just distinguishing between stability of the code base and stability of operation (or something like that). That's not abuse of terminology, just good old polysemy. Of course "stable" can mean lots of different things; just look at our disambiguation page for stability. I think IP 178 has a very good answer: it gives a rather general notion of stability, and correctly points out that we'd have to standardize the perturbation/disturbance, which is itself a tricky problem. Some ideas on this are discussed at stress testing, and the article on stress testing (software) covers some of what the OP is looking for. There are at least 5 different notions of stability just for solutions to ordinary differential equations, and many others for different sorts of physical and informational concepts. That doesn't mean that I'm wrong to say that e.g. NetHack or TeX has a very stable code base. I don't think I've heard anyone use "stable" to mean "free of bugs", but I do hear "stable" used to mean "the program rarely crashes". I think this is the sense the OP means, and it is the sense commonly used for OSes. This is related to "free of changes" or "resist/recover from disturbance", but rather than asking for a single state that doesn't change or is returned to, we must have a region of state or phase space that defines "normal operation". It is true that this is very hard to carefully codify and quantify, but that doesn't mean it doesn't exist. The OP may also consider that any OS is inherently very unstable, in the sense that an OS stuck in one state or configuration would be useless. The fact that my computer does all sorts of different things when I do different things is a feature. This is all just to say that instability is also crucial for control systems: without some instability (in a strict, technical sense of equilibria), we can't get a system to do anything. SemanticMantis (talk) 18:15, 25 March 2016 (UTC)
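Of those ODE notions, the most basic one (Lyapunov stability of an equilibrium) is worth spelling out, since it is the textbook formalization of "stays within a region of state space":

```latex
% Lyapunov stability of an equilibrium x^* of \dot{x} = f(x):
% trajectories starting close enough to x^* remain arbitrarily close forever.
\forall \varepsilon > 0 \;\; \exists \delta > 0 : \quad
\lVert x(0) - x^* \rVert < \delta
\;\Rightarrow\;
\lVert x(t) - x^* \rVert < \varepsilon \quad \text{for all } t \ge 0
```

The "region of normal operation" idea above is the informal analogue: the disturbance plays the role of the perturbed initial condition, and "rarely crashes" plays the role of never leaving the \(\varepsilon\)-neighbourhood.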
March 25
Webcam for ipad
Where can I buy webcams for iPads? — Preceding unsigned comment added by 75.134.16.79 (talk) 00:25, 25 March 2016 (UTC)
- If you mean an external webcam, then I am afraid there is none. At least I couldn't find any some time ago. --Llaanngg (talk) 00:48, 25 March 2016 (UTC)
March 26
Freedom251
Are they being delivered? Aryan ( है?) 04:29, 26 March 2016 (UTC)
- Our article on the Freedom 251 seems to suggest that it's a scam of some kind. So... probably never. SteveBaker (talk) 04:53, 26 March 2016 (UTC)
- Some sources confirm this: [2] and [3]. The red flags point to a massive scam. However, at first I thought it was a kind of "cheap printer, expensive printer ink cartridges" pricing strategy. After all, they could bundle other services with this smartphone like many other companies do. Indeed, they could even offer it for free and recover the cost through the cell phone plan. --Scicurious (talk) 16:19, 26 March 2016 (UTC)
- I've gotten cell phones from TracFone during promotions for free (with a minutes purchase) or for US$5. The minutes purchase can be as low as $7 a month / 7 cents a minute, and it included a smart phone with a camera (I am amazed that Apple manages to sell smart phones for a hundred times more). The $5 phone was an older flip phone. Presumably these were phones they wanted to clear out of their inventory. So it's not impossible to make offers like Freedom 251's, at least in the US, but that doesn't mean this particular offer is legit. StuRat (talk) 16:28, 26 March 2016 (UTC)
- I wonder if there's a relationship with the FreedomPop cell phone company. StuRat (talk) 17:02, 26 March 2016 (UTC)
- I couldn't find any link between these two, besides the name. I also couldn't find any indication that FreedomPop is a scam. Scicurious (talk) 17:09, 26 March 2016 (UTC)
Open Source software and actually accessing the source
If someone releases software under an open-source license, is he also obliged to ensure that the source is actually accessible? What happens if he uploads it to some platform but it gets deleted, for whatever reason? Would the license still be considered an open-source license? Does the author have an obligation to keep the source accessible? Or is it enough that the author does not try to block access to the source?--Scicurious (talk) 16:13, 26 March 2016 (UTC)
- You should not confuse the copyright with physical access to the source code. They are two orthogonal concepts. Ruslik_Zero 19:49, 26 March 2016 (UTC)
- The author of the software, who holds the copyright, is not obliged to comply with the terms of their own license. They could choose a license that forbids redistribution of binaries without source, and then never release the source, and the effect would be the same as if they'd just forbidden redistribution. If the software incorporates third-party libraries copyrighted by others, the author is obliged to follow their licensing terms, which might require distributing source code of the entire program, not just those libraries.
- Whether a license is considered an open source license doesn't depend on how people use it. If you apply the GPL to software for which no source code is available, the GPL is still the same license it always was. -- BenRG (talk) 21:06, 26 March 2016 (UTC)
- The GPLv2 actually does not require giving public access to the code: if you don't distribute the source with the binary, one of the options is to provide "a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code". If you cannot satisfy this offer, you are in breach of contract. --Stephan Schulz (talk) 11:30, 27 March 2016 (UTC)
- Well, my point was that license conditions, such as offering source code if you distribute binaries, only apply to people who accept the license, and if you're the sole copyright holder then you don't need to accept the license because you can't infringe on your own copyright. If there's a contract, that's different, but I don't think the GPL was meant to function as a contract ([4]). -- BenRG (talk) 19:04, 27 March 2016 (UTC)
- My understanding is that if you do not provide the offer (or the source), you are in breach of license. But if you provide the offer, but renege on it, you are in breach of contract. --Stephan Schulz (talk) 21:18, 27 March 2016 (UTC)
- I think you don't understand how licenses work, so I'll go into more detail. If you create software on your own time, by default you own the copyright and anyone else can't redistribute the work without your permission, except for the limited exceptions in copyright law (such as fair use). If you say "anyone may redistribute this software as long as they bundle the source code", that's a license. Its only effect is that people may now do the thing you licensed them to do, namely distribute the software with source code.
- In particular, if you distribute the software without source code, that's fine, because as the copyright holder you always had the right to do whatever you want with the work; the license doesn't affect that. If someone else distributes the software without source code, that has nothing to do with the license, because the license only gives people the right to do a different thing. So the consequences are the same as if the license didn't exist. That is, it's copyright infringement unless you separately gave permission for it or it's covered by something like fair use.
- The GPL is a license. It does nothing except permit people to do things that they otherwise (under copyright law) aren't allowed to do. It's not a contract; it doesn't commit anyone to doing anything that they wouldn't have had to do if the work weren't licensed under the GPL. This is also true of other common open source licenses (e.g. the MIT and BSD and CC licenses). -- BenRG (talk) 04:12, 28 March 2016 (UTC)
- I think we are in agreement about most things, in particular about how a license works; at least, I agree (and think I always did) with your description. But the point is that the GPL does not require you to provide the source: that is just one of the options. It's enough to provide a written offer that you will provide the source. If you choose that option, but then do not honour that offer, I think you are in the same situation as a cereal manufacturer who promises "a brand new Ford Pinto for everyone who sends in 3 empty cartons of Hearthealthy Sugarbombs", but then fails to hand out cars. That is, he is not in breach of the license (he did provide the offer), but in breach of contract (or is this "breach of promise", which I think is more specific?) with respect to whoever wants to take up that offer. --Stephan Schulz (talk) 12:14, 28 March 2016 (UTC)
Dual-channel memory and CAS latency
I've searched for the answer to this but couldn't find it.
If a Windows computer has four DIMM slots with dual-channel memory, the DIMM slots are in pairs, each pair making a bank. I know that the two DIMMs in a bank should be the same. But what if you have DIMMs with one CAS latency in one bank and a different CAS latency in the other bank? Does each bank run at its highest speed, or is everything slowed down to the slowest one? Bubba73 You talkin' to me? 19:00, 26 March 2016 (UTC)
- Three are Intel Q67 chipsets (Sandy Bridge) and one is an Intel P75/B75 (Ivy Bridge). Bubba73 You talkin' to me? 20:08, 26 March 2016 (UTC)
On a computer that had four 1GB sticks with 7/7/7/20 CAS latency, I replaced two of them with 8GB sticks of 8/8/8/24. Speccy still shows 7/7/7/20. Bubba73 You talkin' to me? 19:39, 28 March 2016 (UTC)
Network speed test says my download speed is 2.10 Mbps but my torrent download speed is 280kbps max
Anybody know why? — Preceding unsigned comment added by 2.103.12.77 (talk) 19:07, 26 March 2016 (UTC)
- Things like that send at a very slow rate. I tried one once, and it was going to take 40 days to download, so I gave up. Bubba73 You talkin' to me? 19:27, 26 March 2016 (UTC)
- Torrent speeds depend on the speed of the people uploading to you (seeders), which can vary wildly. You have a higher chance of getting a higher speed with more seeders. In regards to the speed test, are you making sure to convert your results from bits to bytes if applicable? In this case, 2.1 megabits = 268.8 kilobytes [(2.1/8)*1024], which is very close to your stated torrent maximum. -- 143.85.169.19 (talk) 19:19, 28 March 2016 (UTC)
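The arithmetic in the reply above, spelled out; note that ISPs usually advertise decimal megabits, which gives a slightly lower figure than the binary interpretation used above:

```python
# Converting a line speed quoted in megabits/s to the kilobytes/s a
# download client typically displays.
speed_mbps = 2.1

# Binary interpretation used in the reply above (1 Mbit = 1024 Kbit):
kb_binary = speed_mbps / 8 * 1024                  # 268.8 KB/s

# Decimal interpretation ISPs usually advertise (1 Mbit = 1,000,000 bit):
kib_decimal = speed_mbps * 1_000_000 / 8 / 1024    # ~256.3 KiB/s

print(f"{kb_binary:.1f} KB/s vs {kib_decimal:.1f} KiB/s")
```

Either way, a torrent maxing out around 280 KB/s is consistent with a 2.1 Mbps line, so the speeds reported are not actually in conflict.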
PS - be careful with anything on Bit Torrent, etc. Much of it is illegal and from unreliable sources. Bubba73 You talkin' to me? 19:56, 28 March 2016 (UTC)
- And much of it is legal and from good sources. Just as with http, or Tor. --Stephan Schulz (talk) 20:11, 28 March 2016 (UTC)
CCleaner
I recently obtained a program file of v5.11.5408 which works without a formal installation. When I re-enter the browser after CCleaning, my blue links still stay dark blue. What is the cause? Also, which box(es) should 'not' have a tick in them? Apostle (talk) 19:51, 26 March 2016 (UTC)
- I suggest downloading the free version of it from their website, http://www.piriform.com/ccleaner and doing a proper install. I've used it on many computers for several years but I haven't seen the problem you describe. Bubba73 You talkin' to me? 22:20, 26 March 2016 (UTC)
How to install Windows 10 via pendrive
- A step by step guide is sought.
- An extremely short video would also be helpful.
Apostle (talk) 19:51, 26 March 2016 (UTC)
- You could follow Microsoft's instructions here (under "Using the media creation tool"). -- BenRG (talk) 21:16, 26 March 2016 (UTC)
March 27
Why aren't computers and phones advertised by GFLOPS and fillrate or benchmarks?
Sure, GFLOPS can be gamed somewhat, but so can megapixels and many TV specs, and those are still used. And phone marketing at least tells you how many milliamp-hours the battery has, which seems even more jargony than pixels per second, so why not use that too? At any rate, people don't have to know what mAh actually means; they just have to know that this number is bigger, so this battery is better. (Why they don't use joules or watt-hours is beyond me.) What is wrong with manufacturers? Why aren't they bragging about how many GFLOPS their phones can do? It could only help make people's equipment feel outdated so that they buy stuff whether they need to or not. How is calling an improved processor "i3" every year supposed to encourage them to buy things? At least car ads tout the tiniest differences in hp or torque. Sagittarian Milky Way (talk) 04:06, 27 March 2016 (UTC)
- There is no standard benchmark for "GFLOPS" or "fill rates"; it depends entirely on the application structure. Also, many, many applications have an almost pure integer workload, so GFLOPS is not even a good proxy. Megapixels (assuming it refers to screen size) are countable. mAh are measurable (although also quite irrelevant if you don't know what your phone needs). I can speculate that they use mAh because maybe that is the relevant value: the voltage of any battery has a theoretical maximum determined by its chemistry and layout, but in practice it is somewhat variable and typically drops over time. But if every clock cycle transports a (nearly) constant amount of charge, mAh is a better measure than joules. --Stephan Schulz (talk) 11:53, 27 March 2016 (UTC)
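For what it's worth, converting mAh to the watt-hours or joules the OP asks about is trivial if you also know the nominal voltage; the figures below are purely illustrative:

```python
# Energy stored in a battery, from the two numbers usually on its label.
capacity_mah = 3000      # illustrative capacity in milliamp-hours
nominal_voltage = 3.8    # volts; a typical nominal value for a Li-ion cell

watt_hours = capacity_mah / 1000 * nominal_voltage   # 11.4 Wh
joules = watt_hours * 3600                           # 41,040 J

print(f"{watt_hours:.1f} Wh = {joules:.0f} J")
```

This also shows why comparing mAh across batteries with different nominal voltages (as in the exchange below) can be misleading: the energy differs even when the mAh figure is the same.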
- Maybe you're right about mAh. I had thought they were just gaming the specs to obfuscate the amount of energy in the phone to make it look better (since my cheapest phone had a lower-voltage battery than its upgrade). Similar to phone cameras adding megapixels even after reaching the limit of their optics, or very reflective TVs measuring contrast ratio in a pitch-dark room. Sagittarian Milky Way (talk) 17:45, 27 March 2016 (UTC)
- How do you know the voltage was different? The nominal voltage published on the battery is normally pretty meaningless. There can be minor differences in voltage levels (in particular the maximum voltage and possibly the cutoff voltage) relating to safety and battery-longevity decisions made by the manufacturer, but often these aren't that significant. Probably the biggest factor is that some manufacturers just plain lie about the mAh ratings, although this isn't actually that bad with most phones, even cheapish Chinese ones, AFAIK. (Third-party Chinese batteries are another thing.) As mentioned, the other factor is that it doesn't tell you how long your phone will last in a given circumstance unless you know a lot more, and a phone with a higher-capacity battery could easily not last as long due to higher power usage even when basically doing the same thing. (Various runtime figures are normally also published, although these also suffer problems.) Nil Einne (talk) 16:04, 29 March 2016 (UTC)
- Related article: Megahertz myth. I am surprised I couldn't find an article about the bitwars. I probably used a wrong search term. The Quixotic Potato (talk) 19:28, 28 March 2016 (UTC)
Dual monitors now different resolutions
I have a dual-monitor Windows 10 system. I had a DVI monitor as my primary monitor and a VGA one on the right as a secondary monitor. Both were at 1920x1080. I replaced the VGA monitor with another one (with a physically larger screen). When I did that, the primary and secondary monitor designations switched. I got them arranged back the way I want, but now the DVI monitor is at 1536x864 instead of 1920x1080! See the screenshot from Speccy. The Advanced Display Settings show: "multiple displays - extended" and "resolution: 1920x1080 (recommended)".
I've tried rebooting the system and powering the monitor completely off and back on - nothing fixes it.
How can I get both monitors at 1920x1080? Bubba73 You talkin' to me? 05:59, 27 March 2016 (UTC)
- I found that the DVI monitor (only) was set to 125% font size. Setting it to 100% fixed the problem. Bubba73 You talkin' to me? 06:30, 27 March 2016 (UTC)
Resolved
- It's quite a broken design that changes font sizes by changing screen resolution: one of those quick hacks with a "fix later" comment that gets shipped anyway and haunts you for decades... --Stephan Schulz (talk) 17:53, 27 March 2016 (UTC)
- It doesn't change screen resolution and AFAIK never has.
Since Vista, programs which don't advertise themselves (in the manifest) as being aware of the DPI setting are presented with a lower resolution, but the actual resolution remains the same. This was at least partially because probably 99.9% of programs (including Microsoft's own) didn't actually do anything in response to the DPI setting. So, to ensure it actually worked, the solution was to render the window for a lower resolution and upscale it. The results are uglier, but at least they are actually the right size (in terms of UI elements etc.).
After the system was changed in Vista, along with improved APIs for DPI scaling, things improved slightly, but actually not that much. Some programs claimed to be DPI aware but did a poor job, or no job, of actually scaling the UI. In some cases (e.g. games, video players and perhaps stuff like Photoshop, where accurate pixels definitely matter) perhaps this made sense (in that it was a choice between a tiny UI or a screwed-up program). In others, it did not. Many programs correctly don't present themselves as DPI aware, so they are scaled and will see a lower monitor resolution. (For games, Windows will generally automatically disable DPI scaling in the compatibility settings, although you may need to start the program once.) A tiny number of programs actually did decent scaling and presented themselves as such. (I guess initially there were a tiny number of programs which actually did DPI scaling but didn't have the manifest setting, so they were screwed up. Although if they were at all common, I suspect they were detected during MS compatibility testing and added to the list, and the Windows scaling was disabled.)
With Windows 8, there was more attention particularly from Microsoft. Also with this and with the rise of high PPI displays, other software developers started to actually pay attention so there was finally some real improvement particularly with web browsers.
Things changed again in Windows 8.1, given the additional support for a per-monitor DPI setting. Now programs need to be able to handle per-monitor DPI changes when moving between monitors. Programs which don't present themselves as being per-monitor aware are scaled if needed. I believe the highest DPI at logon is presented as the default DPI. So programs which aren't aware of the per-monitor setting, but are aware of the general DPI, will be presented with the higher DPI and the resolution needed for that DPI on that monitor, and then downscaled appropriately. So a bit more work and slightly uglier, but without the blurriness due to upscaling that you would get if programs were presented with the lower DPI and upscaled as needed. (Admittedly this doesn't seem to correlate with the OP's experience, so I could be wrong, or maybe the DPI setting wasn't set at login, or something else went wrong.) Programs which aren't aware of the DPI at all will be treated as they used to be (presented with the resolution needed to give the DPI when scaled). Programs which present themselves as per-monitor aware will be left be.
You can turn off the automatic scaling of unaware programs by changing the compatibility setting for that program. (You can also change the manifest but that's much more involved.) I think there may be a way to turn off the forced scaling for unaware applications completely but can't remember offhand. (I tried to search but just found people talking about setting to default DPI completely which isn't the point.) It's definitely possible to turn off the per monitor setting and only have a single DPI.
Of course, you wouldn't have to worry about forced scaling (or apparently incorrect resolutions) if all programs actually implemented proper scaling, but even now many don't, and ultimately there's nothing that Microsoft can do about that. (You'd still need to make sure you have an appropriate DPI setting.) Admittedly, I think there are still some Microsoft programs which don't seem to quite handle DPI changes properly. In more general terms, one factor is that 125% is small enough that most people don't care if the UI is a bit small, so they often prefer no scaling to the unsharp forced scaling of unaware programs. Once you start to get to 150%, let alone 200%, it becomes much less acceptable. The allowance of non-integral settings obviously complicates scaling, but my understanding is that the APIs are actually fairly decent, just poorly used. (One factor is that legacy APIs, open-source APIs and other reasons mean there are many different ways of doing things in Windows which are still used for modern programs. It also seems a lot more accepted to use a non-standard OS API on Windows than it is on an OS like OS X.)
P.S. I probably didn't explain poor programs well enough. For some of them, they can become unusable because you can't properly read the text, as it's cut off by other UI elements. Unfortunately there's no good solution for these programs other than "don't use them", AFAIK. I don't think Microsoft ever implemented a way to tell such a program the DPI was standard and then use the automatic scaling (or no scaling, though this is most likely with large scalings, since small ones might be ugly but mostly usable), rather than rely on the program's borked internal scaling.
- I think what happened is that I had 125% size set on the DVI monitor before the change. Somehow, when I changed out the VGA monitor, it took over, and the system adjusted the resolution of the DVI monitor. Note that it went from 1920x1080 to 1536x864, which is dividing by 1.25 in both dimensions. Bubba73 You talkin' to me? 23:05, 27 March 2016 (UTC)
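The numbers reported above do line up exactly with a 125% scaling factor; a quick check:

```python
# The "wrong" resolution reported after the monitor swap is exactly the
# native resolution divided by the 125% DPI scaling factor.
native = (1920, 1080)
scale = 1.25

effective = tuple(round(dim / scale) for dim in native)
print(effective)  # (1536, 864)
```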
OFC
What type of OFC (optical fiber cable) is required for 4 Mbps unshared, uncompressed, symmetrical, dedicated leased internet access last-mile connectivity? What are the different options for OFC, and how do the costs compare? What accessory equipment will be needed at the customer premises? I understand a router is redundant, though the service provider tries to push-sell one. An accelerating UTM and a layer 3 switch are needed if I want high network security and content filtering. Please advise about the different, especially the most cost-effective and efficient, OFC and related equipment. The vendors are advising multimode Tx/Rx OFC and say that 6 joints will be required. Please answer. 150.107.176.227 (talk) 10:43, 27 March 2016 (UTC)
- This is a question for your service provider and your network architect; I doubt anyone here will be bothered doing your research for you. Vespine (talk) 21:52, 29 March 2016 (UTC)
March 28
TFT Defective pixel fairy tales
Articles, ISO norms and shop descriptions all more or less claim it's impossible to intentionally produce error-free LCD substrates. But customers never actually "bought" that narrative: when they buy a new TFT monitor online and get one shipped with defective pixels, most send that monitor back, no matter that the technical descriptions clearly mark most as class 2, which means for a "4K"-resolution model you in fact have to expect lots of defective pixels. So I assume most TFTs sold are in fact error-free, because shops and producers know they won't get away with their definitions, their ISO norm, and least of all with that class 2 category they sell most monitors under. So why are they holding on to these fairy-tale norms and this silly narrative that what they in fact do every day is impossible? Who makes up such silly norms? How old are they, and how far from reality might they be? --Kharon (talk) 14:11, 28 March 2016 (UTC)
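For scale: ISO 13406-2 Class II is commonly summarized as tolerating 2 bright and 2 dark full-pixel defects plus 5 sub-pixel defects per million pixels. Assuming that reading of the norm (the limits below are taken from such summaries, not from the standard's text itself), the allowance for a 4K panel works out as:

```python
# Rough count of defective pixels a panel may have and still pass
# ISO 13406-2 Class II, assuming the commonly cited limits of
# 2 bright + 2 dark full-pixel defects and 5 sub-pixel defects
# per million pixels.
def class2_allowance(width, height):
    megapixels = width * height / 1_000_000
    return {
        "bright_full": int(2 * megapixels),
        "dark_full": int(2 * megapixels),
        "sub_pixel": int(5 * megapixels),
    }

print(class2_allowance(3840, 2160))
# {'bright_full': 16, 'dark_full': 16, 'sub_pixel': 41}
```

So under that reading, a 3840x2160 panel (about 8.3 million pixels) could ship with dozens of defects and still be within Class II, which is the gap between the norm and customer expectations that the question is pointing at.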
- ...Maybe you could attend an industry trade show, like Display Week 2016 in May in San Francisco, and direct your questions to the representatives of major manufacturers and resellers?
- Technology manufacturers rarely like to give accurate information about manufacturing yield unless there is a good reason to brag about it. You might also like to learn about root-causes for defects in display hardware: we have an article on defective pixels that might broaden your understanding.
- Nimur (talk) 15:39, 28 March 2016 (UTC)
- The yield bit is also related to quality assurance and quality control. Again, things that are rarely advertised unless they think it's worth bragging about. SemanticMantis (talk) 21:19, 28 March 2016 (UTC)
- I've seen this pattern before of disclaimers saying customers should have low expectations. One that comes to mind is cable TV contracts which say they don't have to fix an outage unless 6 or more houses in an area suffer from it. If they actually refused to fix outages when there are 5 or fewer houses affected, they obviously would lose those customers, but they feel the need to offer that disclaimer anyway. I suspect that in rare cases both might do what they threaten, though. For example, if the manufacturer and cable company were facing bankruptcy, they might then do the minimum replacement or maintenance required by law. (Another silly disclaimer is software that says "We are not responsible if our software destroys your computer". I certainly hope that isn't actually a possibility.)
- As for the bad pixels, how those appear seems very important. If it's just dark, that isn't as bad as being some random color. Also, if there is bleed-over from adjacent pixels, that could mask bad pixels, although it would also make the screen a bit blurrier. StuRat (talk) 15:53, 28 March 2016 (UTC)
- I suspect you're overestimating the number of customers who'd notice or bother to send back monitors with a few defective pixels. And that's only referring to end-user purchasers, not counting the majority of displays which likely end up in offices etc. Nil Einne (talk) 16:14, 28 March 2016 (UTC)
- Actually I read that Dell and HP, aka the "office" manufacturers, have a "zero dead pixel" policy or offer one optionally.
- Anyway, manufacturers always try, and manage, to improve production quality over time, minimizing scrap. I know for sure today's industry will fire any manager who isn't constantly trying to cut preventable loss. Also, now that they do 4K screens at 24", I assume production technology is long past the state it was in when these clearly outdated ISO norms were written. --Kharon (talk) 21:13, 28 March 2016 (UTC)
- One approach that might make sense from an economics POV is to continue to produce them with the occasional dead pixel, if that is cheaper than upgrading the manufacturing process where they can completely eliminate them. But, rather than sell them all to unsuspecting customers, they might inspect them and only sell the good ones. Those with dead pixels, on the other hand, could be donated or sold at a reduced price, with full disclosure that they do have dead pixels. (Selling them all at full price and relying on customers to return those with dead pixels would reduce the inspection costs, but increase shipping costs and customer dissatisfaction and therefore bad ratings.) StuRat (talk) 23:06, 28 March 2016 (UTC)
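The screening economics above can be made concrete with a standard Poisson yield model (the defect rate below is purely illustrative, not a real manufacturing figure): if each pixel independently fails with probability p, the fraction of perfect panels is roughly exp(-p * N).

```python
import math

# Poisson approximation for the fraction of panels with zero defective
# pixels: yield ~ exp(-p * N), where p is the per-pixel defect
# probability and N the pixel count. The rate used here is made up
# for illustration only.
def perfect_panel_yield(pixels, defect_prob_per_pixel):
    return math.exp(-defect_prob_per_pixel * pixels)

n_4k = 3840 * 2160                      # about 8.3 million pixels
print(perfect_panel_yield(n_4k, 1e-8))  # ~0.92, i.e. ~92% defect-free
```

Under this toy model, even a one-in-100-million per-pixel defect rate leaves roughly 8% of 4K panels with at least one bad pixel, which is why inspecting and binning (rather than promising perfection) can be the cheaper strategy.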
- There's obviously a big difference between a standard policy and an optional one. If it's optional, how much does it cost, and what percentage of office computer users actually take it up? You've provided no statistics to suggest defective pixels are as unacceptable to the ordinary consumer as you implied, which is my key point, and your followup doesn't give much more. Also, "I read" - but where? Here's an example of a Dell pixel policy [5]. Here's an HP one [6]. Neither specifies zero pixel defects (although HP allows no full-pixel defects). Of course, large purchasers may negotiate their own policies depending on their demands, and we won't see these.
Just to be clear, I don't know that anyone in this discussion is disagreeing that production improvements probably mean pixel defects are rarer than they used to be, simply that they are unlikely to be zero and that you've provided little evidence for your claim that no one accepts a monitor with defective pixels.
P.S. Dell and HP probably have less tolerance for dead pixels than some less well known manufacturers. Notably, those cheap Korean monitors that are all the rage tend to use the lower-quality panels (although the problems may not just be pixel defects). And standards in developing countries may vary from developed ones, particularly with more mid-tier manufacturers. Note also that you said "send that Monitor back", yet in a number of places this won't actually achieve anything but waste your money. They won't be refunding you or giving you a different monitor; instead they'll be asking you to pay them to get the monitor back. Or more likely they'll just reject the shipment, since you won't get an RMA number. Actually, most likely you won't be sending it back but rather you'll have to take it back yourself, and they'll argue with you a bit and either you'll leave or they'll ask you to leave.
- TESTED.COM - We Uncover the Dead Pixel Policies for Every Major LCD Maker. The Quixotic Potato (talk) 12:25, 29 March 2016 (UTC)
- Good article, but note that it's over 5 years old. StuRat (talk) 13:14, 29 March 2016 (UTC)
- ISO13406 & Dell & Acer & HP & ASUS & Samsung (in Dutch, I can't find the English version). Apologies to the Apple users; it seems like Apple does not share this info, although an internal guideline was leaked in 2010. The Quixotic Potato (talk) 13:45, 29 March 2016 (UTC)