Got a roof box for my BMW 330i GT M-Sport ahead of my drive to Nadugani


Thanks to the quick-release design, it takes about 5 minutes to mount and unmount the box.

BHPian graaja recently shared this with other enthusiasts.

Roofbox installation and Nadugani trip

Roofbox Installation

One of my hobbies is astrophotography. I sometimes travel to places with low light pollution to do some astrophotography. On these trips, I have to carry a lot of equipment in the car. Usually, I can accommodate all the equipment by folding down the rear seats. But on occasions when I have 3 passengers, the available space is limited and I have to drop one of the pieces of equipment. I have been planning to add a roof box to the car for some time. As I have another astrophotography trip to Coorg coming up later this month, I decided to get this done.

Thule roof boxes were very expensive, costing more than a lakh. A friend of mine in Coimbatore had installed a roof box from a company called Pentair, and it was quite good in quality. The only issue was that its clamps had very coarse fixing dimensions and would not fit the fixed crossbar installation on the GT. Thanks to TBHPian “itwasntme”, I got introduced to another company called Farad, whose roof boxes are also of very good quality. They also have quick-release clamps with slots that can accommodate a wide range of crossbar spacing. I went ahead and ordered the Farad Marlin 400-litre roof box. Earlier, I had Thule wing bars with a ProRide bike rack. Since I quit triathlons and cycling a couple of years back, I had removed this setup from the car and stored it. I decided to use the same Thule cross bars to mount the Farad box.

Unboxing the roof box.

The complete kit – Thule wing bars, Thule footkit for F34, Farad roof box and clamps.

The box opened.

These clamps are quick-release. Just pull the top lever and the clamps open up to be released from the cross bar. Press the lever and the clamps close and tighten on the cross bar. With this quick-release design, it takes about 5 minutes to mount and unmount the box.

Fixing the feet (base for the bars) to the roof.

Fixed the Thule wing bars.

The box installed.

Nadugani Drive

I joined the TBHP group from Bangalore for the Nadugani overnight TBHP meet organised by RSUDARSANAN. I drove from Coimbatore through Kotagiri and Ooty and joined everyone in Nadugani, spent a night in a cozy little resort and returned the next day. I will share more pictures in the Nadugani drive thread. Sharing a few pictures of the GT here.

A few pictures of the GT during the onward journey. Drove through some beautiful hill roads with mist.

To avoid the heavy weekend tourist traffic in Ooty, I returned through Kerala, taking the Nilambur -> Mannarkad -> Palakkad -> Coimbatore route. TBHPian 500CountryCruiser in his CLA, Brraj in his Getz and I drove together on this return journey. A few pictures from this journey.

That’s all I have for now, folks.

Check out BHPian comments for more insights and information.




Hanle, Ladakh, is India’s first Dark Sky Reserve: How to get there from Leh


If you’re ready for a spectacular peep into the universe, you may want to put Hanle in Ladakh on your list. The cluster of six hamlets (Bhok, Shado, Punguk, Khuldo, Naga and the Tibetan Refugee habitation) within the Changthang Wildlife Sanctuary has just been formally notified as the Hanle Dark Sky Reserve. Effectively, an expanse of 1,073 sq km around the Indian Astronomical Observatory is now a sanctuary for darkness, where light is managed so that scientists and astronomy enthusiasts can access the night sky in its purest possible form.

Orion rising over Hanle, Ladakh. Photo: Sabit Tisekar/Shot on OPPO

To weed out light pollution at India’s first Dark Sky Reserve, there will be several restrictions on the use of light, including placing curtains on windows and doors and restricting artificial illumination indoors and on vehicles. For a place to qualify as a dark sky reserve, it has to be accessible for all or most of the year, and it has to be accessible to the general public.

Why Hanle?

Much before it became a Dark Sky Reserve, Hanle was picked by the Indian Institute of Astrophysics to set up an observatory. “There were various reasons why Hanle was chosen for an astronomical institute. The best reason is that it is very dry and very cold,” says Dorje Angchuk, engineer in-charge of the Indian Astronomical Observatory at Hanle. “If there is any moisture, most of the light gets absorbed by the atmosphere, and very little light from the stars reaches us. Due to the dry atmosphere [at Hanle], the lights coming from far away sources are not attenuated,” says the scientist who has been in Hanle for 25 years. 

Indian Astronomical Observatory at Hanle, Ladakh. Photo: Sabit Tisekar/Shot on OPPO


Big Myths About Phone Cameras You Need To Stop Believing

Smartphone rear showing camera setup. © Hadrian/Shutterstock

The development of smartphone cameras ranks among the most remarkable innovation stories in recent history. Compared to where we were two decades ago, today’s cameras offer photos and videos that come incredibly close to real life. A quick comparison of today’s midrange smartphone camera with that of the first iPhone reveals stark differences in detail, color gamut, and overall picture quality. Even better, the latest flagships keep us excited about the possibility of higher-quality media in the future.

That said, it’s easy to get the wrong idea about how your camera works or what it can do. On the one hand, the majority of smartphone camera specifications are quantified — megapixel count, sensor size, and the number of lenses, for instance — and it may seem that simply increasing these numbers results in better picture quality. On the other, smartphone cameras are marketed with tempting features like “AI processing” and “astrophotography” that blur the lines between quality and reality. As this article explains, much of what is believed about the modus operandi of smartphone cameras is, well, inaccurate.

Of course, the quality of photos and videos is largely influenced by the smartphone user and environmental conditions. Taking photos in a properly lit setting, keeping the lenses clean, using gridlines, and capturing in high dynamic range are some of the steps that help bring out the best in your camera. But there’s little that can be done to influence the camera’s operation once it’s out of the factory line.

More Lenses Mean Better Photos



Back of the Nokia 9 PureView. © N.Z.Photography/Shutterstock

Over the past few years, there’s been an industry-wide movement toward fitting multiple camera lenses on smartphones. This is not without good reason, as these camera varieties increase the scope of possibilities in smartphone photography. The iPhone 14 Pro, for instance, features a triple-camera setup on the rear: a 48-megapixel (MP) main sensor, a 12 MP ultrawide, and a 12 MP telephoto lens. The ultrawide camera can capture images covering a larger width than the main camera, and the telephoto lens is purpose-built for maintaining high quality at a longer focal distance.

But that’s about where it ends. While it is difficult to achieve all of these with one camera lens, it’s also not the case that engineering several cameras on a smartphone would immediately improve the photography or videography experience. The Nokia 9 PureView perfectly illustrates this point, with its five-camera setup consisting of three monochrome sensors and two RGB sensors, all capped at 12 MP with the same f/1.8 aperture. 

According to Nokia, the cameras were designed to work together in retrieving more light than could possibly go through one lens, and produce “photos with superb dynamic range that capture detail and texture from both the highlights and shadows, plus incredible depth-of-field and vibrant, truest-to-life color.” The phone also boasted RAW photography, which should have been of superior quality given its synergy of five lenses. Indeed, the Nokia 9 PureView’s disappointing photography prowess stands as proof that you don’t automatically get better pictures the more lenses you incorporate.

More Megapixels, Better Photos



Back of the Xiaomi 12T Pro. © Framesira/Shutterstock

Ultimately, most smartphone users just want great photos and videos. Because manufacturers understand this, they will utilize every tool at their disposal to promote their phones as introducing a revolutionary moment in smartphone photography. One of these tools is the megapixel count, which has gone from less than five megapixels on a phone camera two decades ago to 200 megapixels on some of the latest flagship phones.

Simply put, a pixel represents a unit area of information in a digital image. A megapixel — that is, one million pixels — is the foundational indicator of how much information the camera can hold in a shot. Thus, an increase in the megapixel count of a camera implies that it can hold more detail in an image or video frame, which also increases the file size of the media. This is why an image from a higher-megapixel camera retains more detail when you zoom in or crop than one from a lower-megapixel camera.

But more megapixels do not necessarily mean better picture quality. Megapixel count is distinct from other characteristics of the camera, such as aperture, sensor size, and ISO, all of which play a role in the overall output. Just as important with today’s high-megapixel cameras is a feature called pixel binning, by which the data from multiple neighboring pixels is combined into one larger effective pixel, as sketched below.
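To make the pixel-binning idea concrete, here is a minimal sketch in Python/NumPy, assuming a simple 2x2 averaging scheme on a synthetic sensor grid; real camera pipelines bin raw Bayer data with more sophisticated weighting.

```python
import numpy as np

# Illustrative 2x2 pixel binning: every 2x2 block of photosite values is
# averaged into one output pixel, trading resolution for cleaner low-light
# data. The sensor values here are synthetic 12-bit numbers; real pipelines
# bin raw Bayer data before demosaicing.
sensor = np.random.randint(0, 4096, size=(4000, 3000), dtype=np.uint16)  # ~12 MP of raw values

h, w = sensor.shape
binned = sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))  # ~3 MP output

print(sensor.shape, "->", binned.shape)  # (4000, 3000) -> (2000, 1500)
```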

The Lower The Aperture, The Better The Picture



Back of the Xiaomi 12S Ultra. © Zana Latif/Shutterstock

It’s worth mentioning at this point that smartphone users will have different conceptions and ideas of what it means for a photo to have great quality. Some consumers cherish brightness in photos, and hence they would prefer a camera with a lower aperture to let more light in. Others would go for more contrast in their pictures, implying that the aperture would not matter as much to them. Despite this subjectivity, it’s safe to say that smartphone users are all for camera setups that would make them look good.

The aperture on your smartphone’s camera determines how much light comes into the camera to produce the image. It is a hole in the camera lens, as the name implies, and its size is measured in “f-stops” — the lower the f-stop number, the wider the aperture, and the more light is let into the camera. Thus, an f/1.8 aperture camera sensor would let in more light than an f/2.8 aperture sensor. While professional cameras have variable apertures, that of smartphone cameras is fixed (most of the time).
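As a rough back-of-the-envelope check on the f/1.8 versus f/2.8 comparison above, the light gathered scales approximately with the inverse square of the f-number, other factors such as sensor size held equal; a small sketch:

```python
# Approximate relative light-gathering of two apertures: the amount of light
# admitted is roughly proportional to 1 / f_number**2, all else being equal.
def relative_light(f_low: float, f_high: float) -> float:
    return (f_high / f_low) ** 2

print(round(relative_light(1.8, 2.8), 2))  # ~2.42, i.e. f/1.8 admits about 2.4x the light of f/2.8
```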

It’s obvious that lower-aperture cameras are better suited for nighttime photos, but the story is different for daytime shots. Lowering the f-number (widening the aperture) increases the exposure of shots, which in some cases could make photos unnecessarily bright. However, a wide aperture also creates the possibility of background blur in photos, and this can produce spectacular results if done right. The challenge for manufacturers is to find an aperture that delivers impressively on both sides.

Depth Of Field Requires Two Camera Lenses



iPhone 7 Plus. © frank60/Shutterstock

As noted, multiple smartphone cameras do not necessarily lead to better picture quality, but one of the most heavily advertised use cases of a second camera is creating a background blur that accentuates the object in the foreground. This second camera is typically a telephoto lens, but a monochrome camera lens can also be employed.

The smartphone achieves the depth of field effect by taking pictures with both camera sensors simultaneously, adding a blur filter on the background image, and then overlaying the main image on this blurred background. Although this might feature some inaccuracies in edge detection and softness, the result is generally good enough for the photo to be admired.
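A minimal Pillow sketch of that blur-and-overlay idea; the file names and the subject mask here are hypothetical stand-ins for the depth map a phone derives from its second sensor or a segmentation model.

```python
from PIL import Image, ImageFilter

# Software "portrait" effect in miniature: blur the whole frame, then paste
# the sharp original back wherever the subject mask is white. The mask file
# is a hypothetical placeholder for a phone's depth/segmentation map.
photo = Image.open("photo.jpg")
mask = Image.open("subject_mask.png").convert("L")  # white = keep sharp

blurred = photo.filter(ImageFilter.GaussianBlur(radius=12))
portrait = Image.composite(photo, blurred, mask)  # sharp subject over blurred background
portrait.save("portrait.jpg")
```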

In any case, you don’t always need this second lens to create such an effect in your photos. For instance, before Google’s Pixel phones started featuring multi-camera setups, its single cameras achieved commendable results in depth-of-field effects. These flagship cameras were designed with computational processing algorithms to detect the main focus of a photo and blur its other parts. 

In addition, apps like Bokeh Lens on iOS and AfterFocus on Android approximate these effects in taking shots. And if your single-camera phone has a “Pro” mode, there’s a chance that you can modify the focal length of shots and blur everything beyond a certain distance. The possibilities are not endless, but you can certainly achieve the background blur without a second camera.

AI Camera Smarts Always Make Photos And Videos Better



Xiaomi Redmi Note 10 Pro. © Framesira/Shutterstock

It’s 2022, and the term “artificial intelligence” (AI) has become the ultimate marketing keyword in the tech space. AI photography can mean many different things — it’s not an objective descriptor for one specific sort of feature. The presence or absence of “AI” in a smartphone’s camera setup does not make or break the potential for it to capture top-notch photos and video.

It’s true that AI technologies can optimize image quality by modifying advanced properties of the image, such as dynamic range, ISO, and white balance. They can also detect exactly what elements are in an image, and find ways to modify the photo based on these elements. In some smartphones, these technologies also set up parameters before a photo is captured. As such, they can generate photos that are considerably better than the original shots, while retaining picture quality and sharpness.

The problem is that in some cases, AI algorithms modify images based on a preprogrammed definition of what a great photo should look like — a definition that might not be accepted by everyone. Thus, while some smartphone users may absolutely enjoy taking photos with an AI-influenced camera, others would rather disable this feature. In general, AI camera features will truly come of age when they can learn from the patterns and preferences of smartphone users, and subsequently edit photos to work for them.

Image Stabilization Leads To 100% Stable Videos



Smartphone shooting video. © Chaay_Tee/Shutterstock

Among the several improvements recorded in smartphone camera technology, image stabilization remains one of the most significant. We can now record videos that account for and adjust to the instability of human movement. There’s no doubt that image stabilization has improved the smartphone photography experience by leaps and bounds.

Currently, two techniques are most commonly employed for this effect. Optical image stabilization (OIS) works by shifting the camera lens to counterbalance the movements of the smartphone user. It is achieved using a gyroscopic sensor tuned to high levels of sensitivity and precision.

Electronic image stabilization (EIS), on the other hand, works by adjusting each frame to fit a control point, which creates artificial stability in the media. Finally, hybrid image stabilization combines OIS and EIS to ensure that captured media is both stable and clear.
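A toy sketch of the EIS idea using OpenCV, assuming a hypothetical input clip; it estimates only translation via phase correlation and shifts each frame back, whereas real pipelines also model rotation, rolling shutter, and crop margins.

```python
import cv2
import numpy as np

# Toy electronic image stabilization: estimate each frame's drift relative to
# the first frame with phase correlation, then translate it back. Real EIS
# also corrects rotation and rolling shutter and crops the stabilized edges.
cap = cv2.VideoCapture("shaky_clip.mp4")  # hypothetical input video
ok, first = cap.read()
ref = np.float32(cv2.cvtColor(first, cv2.COLOR_BGR2GRAY))

stabilized = [first]
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    (dx, dy), _ = cv2.phaseCorrelate(ref, gray)          # estimated shift vs. reference frame
    shift_back = np.float32([[1, 0, -dx], [0, 1, -dy]])  # 2x3 affine that undoes the shift
    h, w = frame.shape[:2]
    stabilized.append(cv2.warpAffine(frame, shift_back, (w, h)))
```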

Unfortunately, we’re not yet at the point where any of these image stabilization technologies can provide completely stable videos. A close observation would reveal that there are still imbalances in the videos, and these are more pronounced when the user’s movements are particularly shaky. We can, nonetheless, appreciate what is being achieved with today’s image stabilization tech — even as smartphone manufacturers continue to propose new ideas.

For those interested in recording smoother videos, there’s always the option of a gimbal. This handheld device uses gyroscopes to keep the phone stable and is more trustworthy than smartphone image stabilization tech.

All The Cameras Are Used All The Time



Back of the iPhone 13 Pro Max. © Ben Harding/Shutterstock

If you currently have a multi-camera phone, here’s a simple experiment you can try. Open the Camera app and, while focusing on an object, try covering your smartphone’s camera lenses one by one. You’ll most likely find that covering certain cameras would have no effect on the image capture, but covering the main camera would obstruct the image. 

If your smartphone has a range of camera modes, switch between these modes and cover the camera lenses one by one. Depending on the smartphone, camera algorithm, and image conditions, your camera might correspondingly alternate between lenses, meaning that your main lens does not take all the shots.

Smartphone manufacturers are consistently seeking to improve the functionality and output of their cameras using both software and hardware measures. The move towards multi-camera smartphone setups is an indication of this. Nonetheless, it’s seldom the case that all camera lenses are working at the same time. Whether it’s to capture normal images with great colors, obtain macro shots at high quality, or optimize photos for smoother editing, the specialization of each camera lens and type is what helps the smartphone user to achieve the desired result.

Even in cases where more than one camera is in operation, such as a depth-of-field photo, the others are most likely not in use. This further emphasizes the issue of redundancy in smartphone setups with four cameras and above. Interestingly, both midrange and flagship phones feature this multiplicity of cameras while differing greatly in picture quality.

Optical Zoom And Digital Zoom Are The Same Thing



Four iPhone camera arrays. © Hadrian/Shutterstock

Smartphone enthusiasts might recall the Samsung Galaxy S4 Zoom, the 2013-released smartphone by Samsung that featured a retractable camera lens. We might also recall that Samsung did not move forward with similar devices — cameras that are also phones, rather than the other way around — as it was not in alignment with the industry’s movement towards more portable products. The phone remains a major reminder of the difference between optical zoom and digital zoom in cameras, and indeed the superiority of optical zoom over digital zoom.

Just as in professional cameras, optical zoom works by adjusting the lens to increase the magnification of the object. Image quality is not lost during optical zoom within the set limits of focal length and magnification of the lens. Digital zoom, on the other hand, operates by cropping into the frame and enlarging the pictorial information captured by a smaller group of pixels. The fewer the pixels in that crop, the lower the quality of the image when zoomed in, until the image becomes pixelated.
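A minimal Pillow sketch of why digital zoom degrades: it crops a smaller patch of the existing pixels and stretches it back to full size, so no new detail is created. The file name and the 2x factor are purely illustrative.

```python
from PIL import Image

# Illustrative digital zoom: crop the central 1/factor portion of the frame
# and upscale it back to the original size. Nothing new is captured, which
# is why heavy digital zoom eventually looks soft or pixelated.
def digital_zoom(img: Image.Image, factor: float = 2.0) -> Image.Image:
    w, h = img.size
    cw, ch = int(w / factor), int(h / factor)
    left, top = (w - cw) // 2, (h - ch) // 2
    crop = img.crop((left, top, left + cw, top + ch))
    return crop.resize((w, h), Image.Resampling.BICUBIC)

zoomed = digital_zoom(Image.open("photo.jpg"), factor=2.0)
zoomed.save("zoomed_2x.jpg")
```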

By fitting more megapixels into camera setups, smartphone manufacturers continue to push the boundaries of what can be achieved using digital zoom. At the same time, the work on optical zoom lenses for smartphones has not been shelved. It would be exciting to see how smartphone companies navigate through the engineering challenge of installing optical zoom cameras on smartphones with today’s form factor — and if the market will embrace this, should it be eventually achieved.

Smartphones Always Give You RAW, Original Photos



Images taken by the Samsung S22 Ultra. © Gallery/Samsung

In the nascent days of smartphone photography, both the cameras and their resulting photos were of inferior quality. The challenge of translating images from the camera to the phone’s storage was daunting enough, and there was little opportunity for high-level image processing. Today, the story is different, as this opportunity has grown with advancements in software and hardware technology for smartphone cameras. If you take a picture on your smartphone, it’s more likely than not that the resulting image has passed through a series of image processing algorithms before being registered as an image file on the phone.

Furthermore, image processing occurs at different levels. In some smartphones, the algorithms are relatively straightforward: images are modified to reduce noise, increase brightness, and moderate saturation. In others, the algorithms are a bit more complex: image properties are edited based on the environmental conditions of the shot, to create a balance between the beauty of the object and the quality of the image. And yet other computational processes include artificial intelligence and machine learning algorithms that sift through every pixel of the image and, as in the Google Pixel 6 Pro, optimize for skin tone. Certain smartphone companies are driving the development of custom imaging chips: smartphone chipsets dedicated solely to image processing.
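As a concrete, if greatly simplified, illustration of the “straightforward” level of processing described above, here is a Pillow sketch that denoises, brightens, and moderates saturation; actual phone pipelines work on raw sensor data with far more sophisticated models, and the file names here are hypothetical.

```python
from PIL import Image, ImageEnhance, ImageFilter

# Toy version of a basic post-capture processing chain: light noise
# reduction, a brightness lift, and a slight pullback in saturation.
img = Image.open("capture.jpg")                      # hypothetical straight-off-the-sensor JPEG
img = img.filter(ImageFilter.MedianFilter(size=3))   # crude noise reduction
img = ImageEnhance.Brightness(img).enhance(1.15)     # +15% brightness
img = ImageEnhance.Color(img).enhance(0.9)           # rein saturation in slightly
img.save("processed.jpg")
```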

It seems to be the case that consumers prefer photos that come out looking as good as possible and require no edits. This raises the question: do smartphone users want original photos in the first place?

Cameras Work Better With Social Media Apps



Image taken on Snapchat. © Araya Doheny/Getty Images

Speaking of original photos, the rise and acceptance of social media imagery might be an indication that raw photos are not valued as much by smartphone users — especially young people. Taking pictures and videos using Instagram or Snapchat has become a preferred option for many smartphone users, and some even use these as their primary photography apps. Moreover, given the observed differences between the quality of pictures taken with Instagram on iPhones versus Android phones, smartphone users might be inclined to believe that Instagram for iOS has been built to raise the standards of iPhone cameras.

Well, not really. Besides edits, filters, and stickers, these applications do nothing to improve picture quality. The distinction between Instagram photos taken with Android phones and those taken with iPhones has more to do with software-hardware integration than with cameras alone. Since iPhones are built using one company’s chipsets and operating systems, it’s easier to develop mobile apps that utilize this architecture completely. This is not the case for Android phones where, for instance, the Android 12 OS is installed on a Samsung phone running on the Exynos chipset. Until Android-powered smartphones become unified or standardized across the board, these apps will be limited in what they can achieve camera-wise.

Besides noting that social media apps do not improve camera quality, it’s worth mentioning that smartphone users would be mistaken to base their expectations of future cameras on social media imagery. Whether this is the path for future camera innovation remains to be seen.

Smartphone Cameras Have Surpassed Professional Cameras



Smartphone camera side-by-side with professional camera. © Youngoldman/Getty

As excited as we are — and should be — about smartphone cameras, and as elegantly as they are advertised by smartphone manufacturers, they haven’t quite gotten to the level of professional cameras yet. There’s still a long way to go and some tough decisions to make.

Of course, smartphone cameras are already very good. Users can capture almost any type of image or video on these cameras. Some of them, like the iPhone 14 Pro, have even been promoted as capable of shooting full-length movies. In addition, smartphones are shipping with ever-improving features for photo, video, and zoom quality. The iPhone 14 Pro’s “Cinematic Mode” brings to smartphones what was once thought to be only achievable on professional video cameras. Without question, it only gets better from here.

For now, though, smartphones haven’t quite reached the mark. DSLR cameras are still more true to life than smartphone cameras, as they are better at capturing the patterns of color, light, and shade in photos. Also, professional cameras provide more accurate control over imaging properties such as exposure, focal length, bokeh, ISO, and white balance. Thus, even at equal resolutions, higher quality images are achieved with professional cameras than with smartphone cameras.

If nothing else, smartphone camera developments are still driven by a mission to bring the full capability of professional cameras to the smartphone level. This threshold might be crossed sooner rather than later, and that is why the future holds great things for smartphone camera technology.

Read this next: Things You Never Knew Your iPhone And iPad Could Do


Save $250 on the Panasonic Lumix G100 camera this holiday season


Snap up a $250 discount on the Panasonic Lumix G100 mirrorless camera this holiday season as Best Buy currently has this on sale for a fantastic price. 

The $250 discount is a pretty big drop, right before Christmas too, and a fantastic price for this camera. We like this camera a lot, and not just for its discount: it features in our camera deals and our best beginner cameras guides.




Apple should fear the Galaxy S23 Ultra and the year ahead


If you want the fastest phone, buy an iPhone 14 Pro. If you want the best camera and selfies? DxOLabs says it’s the iPhone 14 Pro. The brightest and most accurate phone screen? Also the iPhone 14 Pro. Now, leaks suggest that Apple’s unchecked dominance could be in jeopardy, as the Samsung Galaxy S23 Ultra could be the biggest threat the iPhone has faced in years. 

We criticize Apple for its walled-garden approach, but that isn’t why people buy the best iPhones, especially not the high-end iPhone 14 Pro. It’s true that Apple services like iMessage, or its iCloud backup, keep users addicted to Apple and feeling unable to leave. If the phones weren’t great, folks would find a way. 

Apple phones used to have the coolest design, but Samsung and Google caught up, and who doesn’t wrap their phone in a case anyway? It used to be that Apple phones were so much prettier and sleeker than the competition, anyone interested in style wouldn’t be caught in public carrying anything less. Those days are behind us.

Apple iPhone still has the best cameras, right?

Apple cameras are still the best, but Apple doesn’t pack all of the best cameras into its iPhone. You can snap the best wide and ultrawide shots with the iPhone 14 Pro, but you can’t shoot a 10X optical zoom photo like the Galaxy S22 Ultra, and you can’t make amazing, enhanced astrophotography photos like you can with Samsung’s flagship and the Google Pixel 7 Pro.

Recent leaks of the Galaxy S23 Ultra spec sheet show us that Apple could fall even further behind in the coming year. Samsung is taking no prisoners with its 200MP camera sensor, rumored to be the highlight of the next big phone, launching in February 2023.

Samsung Semiconductor has been making a 200MP sensor for a year, but Samsung Mobile Experience hasn’t included one in a Galaxy phone yet. Motorola sold the Edge 30 Ultra with Samsung’s big sensor, but we’re hearing from unusually enthusiastic leakers that the 200MP camera sensor in the Galaxy S23 Ultra will be a leap ahead of what current phones can produce.

Is the iPhone still the fastest phone you can buy?

Besides the new camera sensor, we already have a good idea how the upcoming Galaxy S23 family will perform. Qualcomm has been bluntly hinting that it will be the exclusive mobile platform for Samsung’s next phone worldwide, leaving behind the underperforming Samsung Exynos.

Some kind folks have benchmarked the Snapdragon 8 Gen 2 chipset that the Galaxy S23 Ultra will presumably use, and it performs admirably. Using benchmark tools that work on both iPhone and Android, the next Qualcomm platform comes closer than any previous Snapdragon to matching Apple’s current chipset.

It is not quite a win, and benchmarks are a measurement tool, not a complete picture. Still, the gap between Apple’s Bionic chips and the Android world used to be so wide that it wasn’t even close.

This drew more developers to Apple. It ensured fewer complaints of dragging performance by iPhone owners. If Samsung phones run as fast as iPhones and don’t slow down, what will owners complain about? Not much.

What can Apple do to keep winning?

There won’t be one phone to scare Apple, but Apple should be scared. Its dominance in every way is crumbling. In the next year or so, we could see Samsung phones that beat Apple in performance, camera image quality, and who knows how many other new ways? Samsung phones already pack features like stylus support that the iPhone lacks.

We hear that Apple may launch a folding phone, or perhaps an iPad tablet that folds. This is a classic Apple move, especially in the face of a challenge from its biggest hardware competitor. When Samsung changed the market with its humongous Galaxy Note phones, Apple spent years lying in wait before catching up quickly with big phones of its own.

Still, what is the win for Apple in folding phones? Will its phones be the most foldable? How will Apple differentiate any of its new devices in the future, foldable or flat, as competitors finally close the gap on key features and benefits that before went unmatched?

Apple’s walled garden needs a hedge

The walls are crumbling for Apple from all sides. The iPhone is expected to lose its Lightning port in the next generation or so, which will mean the end of contemptibly proprietary accessories.

If your iPhone accessories work just as well with any Samsung or Google phone, you might not feel like you’re stuck with Apple the next time you need a new phone. Losing Lightning in favor of USB-C is great for consumers, and also great for competitors. Apple should be afraid, as another garden wall falls down.

Google is making strides with RCS messaging and other text message adaptations that foil Apple’s iMessage advantage. Even prehistoric features, like phone calls, are seeing improvements on Google Pixel phones that Apple won’t match. Google just launched clear calling AI features on its Google Pixel 7 phones, enabled exclusively by the Tensor G2 chip.

We haven’t seen similar exclusive features enabled by the Apple A16 Bionic chip, just raw performance. Apple has felt comfortable leading the smartphone pack with the fastest phones, the best cameras, and the coolest designs. As those advantages fade, Apple should be worried about remaining stagnant while the competition races ahead.

Before the world shifts on its axis and Apple is suddenly threatened by Samsung, you might want to take a look at some of the best phones of the year, which include models from, naturally, Apple and Samsung.


Finding Balance at Yonder – Door County Pulse


Don Krumpos navigates making a living while making art

On a recent weekday morning, Don Krumpos was rearranging things at Yonder, the Algoma gallery that describes itself as a creative space run by artists. 

Krumpos was coming off hosting his First Friday event, which featured 16 guest artists participating in an auction that raised $1,025 for Planned Parenthood of Wisconsin and $138 for the Transgender Law Center. The evening also included music by Morgan Piontek and drew more than 100 comments on Yonder’s Facebook page.

The gallery is a display space, work space and event space on Steele Street, which is Algoma’s cultural center. Next door is Yardstick, a well-stocked independent bookstore. Across the street is Ruse, a fairly new bar in a 1905 building. And down the block are Clay on Steele – featuring ceramic arts – and LadyBug Glass, which shows glass plus abstract photography and astrophotography.

Art by Krumpos. Photo by Rachel Lukas.

Krumpos is primarily a printer whose favorite work is etchings on copper plates.

“I really enjoy the process, the iterative process,” he said. “You are always changing. You print one, and you make changes on it, and then you print it again and see how it looks. I also like wood-block printmaking and screen printing, which I finally have set up, so now we’re doing our own T-shirts and things like that.”

Krumpos and his spouse, Erin LaBonte, who’s also an artist, collaborate on public murals such as the one on Bayside Bargains in Sturgeon Bay and another that will hang from Sturgeon Bay’s City Hall. Many Algoma building facades also feature displays of their work, and they have established a reputation as mural creators – work that now pays reasonably well.

“I like to create an art space that people can walk into,” Krumpos said of Yonder. The gallery – with its prints, puppets and assemblages, as well as a three-quarter-scale covered wagon – creates some of that.

“It has a narrative to it. We’re building an idea. Just come in and look. We’re always working on it, creating a place [where] you can come in and see that it’s a little different than anywhere else. It’s not going to be a Gap.”

Yonder’s storefront. Photo by Rachel Lukas.

Krumpos lived in Minneapolis from 2009 until 2013, working as a graphic designer and web designer for companies such as Under Armour, Best Buy, 3M, Medtronic, United Health Care and several other medical tech and insurance firms – work he continues to do from Algoma. He moved back to Wisconsin to teach at UW-Green Bay for a year to fill in for a professor on sabbatical, and then he continued to work as an adjunct for several years.

LaBonte taught art full time at Silver Lake College in Manitowoc for eight years until the school closed in 2020 because of the pandemic. She then taught as an adjunct at UW-Green Bay, and this year, she started teaching art at Kewaunee High School.

The two are working out how to create art, contribute to the community, raise their two-year-old son and build financial security. It’s a mix that requires flexibility and improvisation. 

Full-time university teaching posts are notoriously scarce, and adjunct positions are notoriously poorly paid. Although Krumpos has enjoyed working for nonprofits and teaching as an adjunct, he recognizes that that sort of work will never pay enough.

Art by Krumpos. Photo by Rachel Lukas.

“You feel good about it, but you’re never getting over that bump,” he said, speaking frankly about money and balancing the need to earn with the desire to create art. “It’s just, that’s the way it goes. Like I would love the transition to this,” he said, referring to his printmaking, assemblages and puppets. “It’s still rewarding. A hobby.”

Then he paused and reconsidered.

“I’m reluctant to call it a hobby because I feel like being an artist is a frame of mind. You’ve just got to go to work sometimes, and support the things you do want. Then it gives you the freedom to just do what you want and not ask if someone would buy it. You can just do it because you have been bankrolling yourself.”

Krumpos has concluded that with commercial jobs, the rate of pay is inverse to the degree of interest in the work, but he questions how many people can make a living from art.

“This idea of making art and selling it – making a good living off of it – it’s very difficult. I gave up the dream of thinking [that] somebody can be just an artist. It’s very, very difficult. … I think about that stuff all the time, but now I’m taking the pressure off myself a little bit – off any kind of expectation because I really just enjoy my life. I don’t need to show in other galleries because we have our own.”


The Geminid meteor shower peaks tonight. Here’s how to see it


Zip up those winter coats — one of the best meteor showers (if not the best) is predicted to peak tonight.

The Geminid meteor shower occurs when the Earth moves through the debris trail of the asteroid 3200 Phaethon. This year’s spectacle began on Dec. 4, and the number of meteors streaking across the sky has been steadily growing as the days pass. 

Tonight (Dec. 13), it’s estimated that we’ll be able to see between 60 and 120 meteors per hour, or one to two per minute, as the shower peaks. And that’s a very high frequency, as far as meteor showers go. Given all the variables in tonight’s night sky, the best time to look for the Geminids is after dusk and before moonrise, which will happen around 10:00 pm local time, per the American Meteor Society. And it’s pretty much a one-shot deal — while the meteor shower will continue through Dec. 17, the number of meteors drops dramatically each day. 

Related: Meteor showers 2022-23: Where, when and how to see them


As for the specific timing of the Geminids tonight, the highest frequency of meteors will actually occur in the morning — 8 a.m. EST (1300 UTC), according to In The Sky. Since the sun will be up then, the best viewing will likely be in the pre-dawn hours. To look for the meteors, all you have to do is look up. Though they’ll all emanate from the constellation Gemini (hence the meteor shower’s name), they can appear anywhere in the sky.

So what makes the Geminids one of the best meteor showers of the year? For starters, there’s its frequency; only the Perseid meteor shower in August regularly competes with that frequency. Then, there’s the fact that Geminid meteors are often slower and brighter than other meteors.

“Because Geminid meteoroids are several times denser than the cometary dust flakes that supply most meteor showers, and because of their relatively slow speed with which they encounter Earth (22 miles (35 km) per second), these December meteors appear to linger a bit longer in view than most,” according to Space.com skywatching columnist Joe Rao. “Moving at roughly half the speed of a Perseid or Leonid, a Geminid fireball can be quite spectacular and dazzling enough to attract attention even in bright moonlight,” Rao added.

But there’s one big problem: visibility might not be so great. There’s quite a bit of bad winter weather occurring across the country, so the night sky will be obscured for many. But even if the skies are clear where you are, the moon is another problem. It’s in its waning gibbous phase at 70% brightness, which means all but the brightest meteors will be washed out by moonlight. 

If you’re looking to photograph the Geminid meteor shower, put down that cell phone camera and check out our recommendations for the best cameras for astrophotography and best lenses for astrophotography. And while the Geminids are best observed with the naked eye (you’ll have the widest field of view, which means you’ll have the best chance at spotting a meteor as it zips by), there’s plenty else to see in the night sky. Take a look at our guides for the best binoculars and the best telescopes to get up-close views of other astronomical objects.

Editor’s Note: If you snap the Geminid meteor shower and would like to share it with Space.com’s readers, send your photo(s), comments, and your name and location to [email protected].

Follow Stefanie Waldek on Twitter @StefanieWaldek. Follow us on Twitter @Spacedotcom and on Facebook. 




Big Tech Loves Space, But Not Enough


The space race is on, and the tech industry can get quite picky. Microsoft, Amazon, and IBM have been investing heavily, catering to spacetech companies both private and public. Google, on the other hand, is contributing only sparsely, while Meta and Apple are largely sitting out of spacetech, limiting themselves mostly to art and entertainment.

In 2017, SpaceX announced that it was partnering with Google to launch the biggest space project ever, which was later revealed in 2021 to be providing Starlink ground stations at Google’s data centres. So far, nothing big has materialised. Besides Google Sky, a celestial map that shows objects like stars, constellations, galaxies, planets and the Earth’s Moon, and the Rubin Observatory’s use of Google Cloud, the tech giant has not made major breakthroughs in the spacetech landscape.

On the other hand, Microsoft has been partnering with NASA on projects since the early days of space exploration. In 2020, Microsoft made a deal with SpaceX to connect its cloud computing network through Starlink satellites. Google then made a similar deal with SpaceX for the same purpose, but so far nothing substantial has been announced.

Microsoft went on to launch the Azure Space platform, providing cloud infrastructure for space companies. Recently, Microsoft announced that it is democratising the space development industry by offering infrastructure-as-a-service through the Azure Orbital Space SDK for developing, testing and deploying applications on space hardware.

IBM started partnering with NASA more than five decades ago for space missions. Since then, it has been developing technologies and providing infrastructure for space companies beyond NASA. In 2021, IBM collaborated with HPE’s Spaceborne Computer-2 and the ISS National Lab to provide edge computing solutions in space. In 2022, IBM partnered with Sierra Space to build space infrastructure for the commercialisation of low-Earth orbit (LEO), furthering cloud technology in space.

Recently, Amazon took a big step and started offering ML and compute software in LEO for 10 months through a partnership with D-Orbit and Unibap. The AWS in Space project aims to build and provide cloud services for spacetech innovations by connecting with the AWS Ground Station, allowing data to be stored and managed on Kinesis and ML applications to be built on SageMaker.

So far, (not) so good 

NASA and ISRO have been the long-standing giants of the space industry, but they too have partnered with IBM and Amazon for data and infrastructure services from the beginning. After Elon Musk created SpaceX and paved the way for private companies to take part in the space race, companies like Microsoft and Google, along with other startups, started making breakthroughs.

In 2020, Google partnered with NASA’s Frontier Development Lab to upscale low-resolution images using AI. Google has also proposed another project to enable navigation on the Moon’s surface without GPS. It looks like Google has been sitting on the sidelines at the infrastructure level, not investing heavily in infrastructure for space missions and mostly supporting a mission once it has been successfully deployed – aka ‘post-launch assistance’.

In November, during the launch of NASA’s Artemis, videos of the take-off were simulcast in ultra-high definition through Meta Horizon Worlds venues, a Meta streaming product that supports 360-degree streaming around the world. Though Meta has been making the metaverse look more and more possible, it is still a long way from spacetech, as it does not have its own cloud services.

Meanwhile, Apple had made no big bids in the space or even the cloud infrastructure industry until September, when it announced a $450-million investment plan in satellite infrastructure to compete with Starlink. It also plans to invest $50 billion by 2026. Earlier, the Mac Observatory had been using Apple Macs for astrophotography. Clearly, Google is ahead of Meta and Apple in this field, but all of them can do much better.

Need of the hour 

Google, Apple and Meta currently stand at the top of innovation in the tech industry. By investing in IaaS (infrastructure as a service) for spacetech, they can contribute to the development of new space technologies. Many of the technologies being developed for space exploration, including satellite-based internet and space-based cloud infrastructure, require advanced platforms and data centres to function, much as SpaceX is doing with Starlink.

Meanwhile, in India 

Globally, there are over 5,500 spacetech companies, and 368 of them are in India. Moreover, the number of data centres in India is growing, with Microsoft, IBM, AWS and the Adani Group planning to build more, yet making these facilities available to spacetech companies remains unaddressed. Indian spacetech companies like Skyroot Aerospace, Bellatrix Aerospace and Agnikul are proving successful in space missions, and having this infrastructure within the country could put India ahead in the space race.


Venus Optics launches new compact ultra-wide lens for Fujifilm GFX cameras


Venus Optics, manufacturers of Laowa lenses, has expanded its growing line of Fujifilm GFX-compatible lenses with this, the Laowa 19mm f/2.8 Zero-D GFX. At just 80mm in length and weighing 546g, it’s one of the smallest and lightest GF-mount lenses on the market. Inside the compact barrel is a 12-element, 10-group optical stack, incorporating two aspherical elements, three extra-low dispersion elements and one ultra-high refractive element.

Read more: the best Fujifilm GF lenses

(Image credit: Venus Optics)


While the lens itself may be small, it can capture an extremely large 110-degree angle of view, and Laowa claims it can do so without generating noticeable distortion. A large f/2.8 maximum aperture helps reduce depth of field and increase bokeh softness, while the 5-bladed diaphragm can render 10-point sunstars around bright points of light.

(Image credit: Venus Optics)


Though primarily designed as a wide-angle optic, the 19mm f/2.8 Zero-D GFX can also double as a macro lens thanks to its short 18cm minimum focus distance. The reasonably fast maximum aperture combined with the wide angle of view could also make the lens suitable for occasional astrophotography. Filter attachment is made possible by a 77mm filter thread, enabling the fitment of polarizing or neutral density filters for long-exposure landscape shots.

Laowa 19mm f/2.8 Zero-D GFX

(Image credit: Venus Optics)

As you’d expect for a Laowa lens, the 19mm f/2.8 Zero-D GFX is a fully manual lens, with no autofocus or electronic communication between lens and host camera. However, manual focusing is much more forgiving when shooting at shorter focal lengths, and the lens has a focusing scale adjustment system to further speed up the manual focusing process.

(Image credit: Venus Optics)


The Laowa 19mm f/2.8 Zero-D GFX is available to buy now from the Venus Optics web shop, priced at US $999. For context, that’s less than half the price of the two widest GFX lenses currently offered by Fujifilm itself: the GF 20-35mm f/4 R WR ($2,499) and the GF 23mm f/4 R LM WR ($2,599).


How to photograph a meteor shower with your phone


Spotting a shooting star during one of the dozens of meteor showers a year can be a rewarding experience if you’re lucky enough to see the night sky spectacle, but capturing that feeling with a smartphone can involve some skill.

About 30 times a year, the Earth passes through a trail of debris left over from a comet or asteroid. Some of the most well-known meteor showers have been observed for thousands of years; the Perseid meteor shower in August was first recorded 2,000 years ago in China.

GEMINID METEOR SHOWER PEAK: WHEN AND WHERE YOU CAN SEE IT

These fast and fleeting showers of light can be easy to see with the naked eye as long as you are in a dark location with minimal cloud cover. Capturing the experience using your phone’s camera is also possible with some finesse, according to Pixsy CEO Kain Jones.

Jones shared some astrophotography tips with FOX Weather to help sky gazers capture the elusive meteor shower.

Photographers set their tripod to take pictures during a cold winter evening in Sonamarg, about 100kms northeast of Srinagar, the summer capital of Jammu and Kashmir. (Photo by Saqib Majeed/SOPA Images/LightRocket via Getty Images)

You’ll need a basic setup for success with any night sky photography. You’ll want to be as far away from city lights as possible. To find the best location, check the forecast: heavy cloud cover can ruin your chances of getting a shot of the moon or a meteor.

“Use a tripod to keep the camera (as) still as possible,” Jones said. “Be in an extremely dark environment, away from any city lights, away from any lampposts, car lights. You don’t want any extra light to flood into the lens.”

Jones said using a timer can also help avoid any blurry images.

7 TIPS ON HOW TO PHOTOGRAPH THE MOON

“Keep the phone very still,” he said. “Start a timer so you don’t have to touch it. Set it for 5 seconds, walk back, stay very still, and really let that exposure run for as long as possible.”

Astrophotography exposures can run between 5 and 30 seconds, depending on what you want to capture.
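For context on those exposure lengths, one widely used rule of thumb in astrophotography (the “500 rule”, not something Jones cites) estimates the longest exposure before stars start to visibly trail due to Earth’s rotation; a quick sketch:

```python
# "500 rule" of thumb: the longest exposure, in seconds, before stars begin
# to trail is roughly 500 divided by the 35mm-equivalent focal length.
# Phone main cameras typically sit around a 24-26mm equivalent, which is why
# exposures in the 5-30 second range quoted above are workable.
def max_untrailed_exposure(focal_length_35mm_equiv: float) -> float:
    return 500.0 / focal_length_35mm_equiv

print(round(max_untrailed_exposure(26), 1))  # ~19.2 seconds for a ~26mm-equivalent lens
```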

Jones also recommends getting in a few practice shots in a nighttime setting before your meteor shower or astronomy event.

The number of camera lenses and settings has increased with each version of the iPhone. The best additions for astrophotography were low-light photography and the RAW file format, Jones said.

The Apple iPhone 13 Pro Max and iPhone 14 Pro have a night mode for low-light photography, which simulates a slow shutter speed, allowing more light in. Users can also opt to shoot photos in RAW file format, which captures the uncompressed image data allowing for more detail in your photos.

“Those will capture every single pixel and piece of data, which can be really useful for editing later on,” Jones said.

The Apple iPhone 11 Pro and later have a night mode.

According to Pixsy, the Samsung Galaxy S21 Pro and Ultra include several lenses best for taking sharp images.

The Ultra’s 108-megapixel sensor is best for low-light conditions. The Pro doesn’t have the same processing power, but its bright night mode can take good nighttime images.

Samsung breaks down the shutter speed and ISO settings you need to capture stars, the moon and the Milky Way in this guide. Most require a slow shutter speed of up to 30 seconds and a high ISO.

The Google Pixel 6 Pro comes with three lenses and a low-light mode called Night Sight. The “Motion” mode is suitable for long exposures if you are trying to capture multiple shooting stars during a peak meteor shower event.

Night Sight is available on Pixel 4a and newer phones.

On the Pixel 4 and later, you can capture time-lapse night sky photos.

Open the camera app and tap Night Sight. When your phone is stable on a tripod or other surface, it will display an “astrophotography on” message. When you are done, hit “stop.”

Many photography apps offer more flexibility if your phone can’t adjust exposure and shutter speed.

Most popular apps require a small payment of between $2.99 and $25 a year to unleash their full potential and settings.

NightCap is designed with astrophotography and low-light situations in mind.

For $2.99, the app allows you to create long exposures and adjust the ISO and f-stop. It even has settings to capture shooting stars, the moon and human-made objects in space, including the International Space Station.

After working so hard to capture the night sky, you want to ensure no one grabs your image off social media and claims it as their own.

Pixsy uses reverse image search technology that monitors images across social media and websites to find where your photos end up.

This technology can be helpful to amateur photographers, professionals or even parents who want to protect images of their children.

“We have plane geeks. We have space geeks. We have astrophotographers. Yeah, we have miniature photographers, flower photographers, medical photographers,” Jones said. “It’s quite interesting when you get into all the different sort of niches of these sort of subsets of photographers.”

Jones estimates there are 7 trillion images on the internet, and it’s just impossible for someone to look everywhere to track their pictures online.

“It’s a crazy amount, and only really this technology can kind of help keep track of where it’s all being used,” Jones said.
