Hey, folks,
I'm reasonably new to Cocoa development. I'm playing around with a subclass
of NSView that generates some stuff based on the size of the view. The
code basically looks like this:
-----------------------------
@interface FooView : NSView {
    NSRect myBounds;
}
- (void)doSomeStuff;
@end

@implementation FooView

- (void)doSomeStuff
{
    // myBounds at this point is {0, 0, 0.0, 0.0}, even if this is
    // called after drawRect: has fired.
    // This will also set myBounds to {0, 0, 0.0, 0.0}:
    myBounds = [self bounds];
}

- (void)drawRect:(NSRect)aRect
{
    myBounds = [self bounds];
}

@end
-------------------------
The weird thing is, when drawRect: gets called, [self bounds] returns a
reasonable-looking bounding rectangle, but when doSomeStuff gets called,
[self bounds] returns {0, 0, 0.0, 0.0}. The extra weird thing is that if
doSomeStuff is called *after* drawRect: has fired, myBounds is still
{0, 0, 0.0, 0.0}, *despite* it being set to something reasonable in drawRect:.
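In case it's useful, the only further check I can think of is to log the
instance pointer and the bounds from both methods, to see whether the two
messages are even reaching the same FooView object. A minimal sketch of that
(the NSLog lines are additions for illustration, not in my real code):
-----------------------------
- (void)doSomeStuff
{
    // Log which instance we're in and what its bounds are right now.
    NSLog(@"doSomeStuff: self = %p, bounds = %@",
          self, NSStringFromRect([self bounds]));
    myBounds = [self bounds];
}

- (void)drawRect:(NSRect)aRect
{
    // Same log here, for comparison with the doSomeStuff output.
    NSLog(@"drawRect: self = %p, bounds = %@",
          self, NSStringFromRect([self bounds]));
    myBounds = [self bounds];
}
-----------------------------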
Any ideas what's going on here?
Cheers,
--
Jim
Fear is the dark room where the Devil develops his negatives.
- Gary Busey